41684 1727204443.27631: starting run
ansible-playbook [core 2.17.4]
  config file = None
  configured module search path = ['/root/.ansible/plugins/modules', '/usr/share/ansible/plugins/modules']
  ansible python module location = /usr/local/lib/python3.12/site-packages/ansible
  ansible collection location = /tmp/collections-G1p
  executable location = /usr/local/bin/ansible-playbook
  python version = 3.12.5 (main, Aug 23 2024, 00:00:00) [GCC 11.5.0 20240719 (Red Hat 11.5.0-2)] (/usr/bin/python3.12)
  jinja version = 3.1.4
  libyaml = True
No config file found; using defaults
41684 1727204443.28253: Added group all to inventory
41684 1727204443.28256: Added group ungrouped to inventory
41684 1727204443.28260: Group all now contains ungrouped
41684 1727204443.28269: Examining possible inventory source: /tmp/network-M6W/inventory-5vW.yml
41684 1727204443.42161: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/cache
41684 1727204443.42215: Loading CacheModule 'memory' from /usr/local/lib/python3.12/site-packages/ansible/plugins/cache/memory.py
41684 1727204443.42237: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory
41684 1727204443.42280: Loading InventoryModule 'host_list' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/host_list.py
41684 1727204443.42328: Loaded config def from plugin (inventory/script)
41684 1727204443.42330: Loading InventoryModule 'script' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/script.py
41684 1727204443.42358: Loading InventoryModule 'auto' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/auto.py
41684 1727204443.42417: Loaded config def from plugin (inventory/yaml)
41684 1727204443.42419: Loading InventoryModule 'yaml' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/yaml.py
41684 1727204443.42483: Loading InventoryModule 'ini' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/ini.py
41684 1727204443.42769: Loading InventoryModule 'toml' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/toml.py
41684 1727204443.42771: Attempting to use plugin host_list (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/host_list.py)
41684 1727204443.42774: Attempting to use plugin script (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/script.py)
41684 1727204443.42779: Attempting to use plugin auto (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/auto.py)
41684 1727204443.42783: Loading data from /tmp/network-M6W/inventory-5vW.yml
41684 1727204443.42825: /tmp/network-M6W/inventory-5vW.yml was not parsable by auto
41684 1727204443.42872: Attempting to use plugin yaml (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/yaml.py)
41684 1727204443.42903: Loading data from /tmp/network-M6W/inventory-5vW.yml
41684 1727204443.42956: group all already in inventory
41684 1727204443.42961: set inventory_file for managed-node1
41684 1727204443.42967: set inventory_dir for managed-node1
41684 1727204443.42968: Added host managed-node1 to inventory
41684 1727204443.42969: Added host managed-node1 to group all
41684 1727204443.42970: set ansible_host for managed-node1
41684 1727204443.42970: set ansible_ssh_extra_args for managed-node1
41684 1727204443.42973: set inventory_file for managed-node2
41684 1727204443.42975: set inventory_dir for managed-node2
41684 1727204443.42975: Added host managed-node2 to inventory
41684 1727204443.42976: Added host managed-node2 to group all
41684 1727204443.42977: set ansible_host for managed-node2
41684 1727204443.42977: set ansible_ssh_extra_args for managed-node2
41684 1727204443.42979: set inventory_file for managed-node3
41684 1727204443.42980: set inventory_dir for managed-node3
41684 1727204443.42981: Added host managed-node3 to inventory
41684 1727204443.42981: Added host managed-node3 to group all
41684 1727204443.42982: set ansible_host for managed-node3
41684 1727204443.42982: set ansible_ssh_extra_args for managed-node3
41684 1727204443.42984: Reconcile groups and hosts in inventory.
41684 1727204443.42987: Group ungrouped now contains managed-node1
41684 1727204443.42988: Group ungrouped now contains managed-node2
41684 1727204443.42989: Group ungrouped now contains managed-node3
41684 1727204443.43041: '/usr/local/lib/python3.12/site-packages/ansible/plugins/vars/__init__' skipped due to reserved name
41684 1727204443.43131: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments
41684 1727204443.43161: Loading ModuleDocFragment 'vars_plugin_staging' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/vars_plugin_staging.py
41684 1727204443.43181: Loaded config def from plugin (vars/host_group_vars)
41684 1727204443.43183: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.12/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=False, class_only=True)
41684 1727204443.43188: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/vars
41684 1727204443.43193: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.12/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False)
41684 1727204443.43223: Loading CacheModule 'memory' from /usr/local/lib/python3.12/site-packages/ansible/plugins/cache/memory.py (found_in_cache=True, class_only=False)
41684 1727204443.43467: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
41684 1727204443.43530: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py
41684 1727204443.43555: Loaded config def from plugin (connection/local)
41684 1727204443.43559: Loading Connection 'local' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/local.py (found_in_cache=False, class_only=True)
41684 1727204443.44067: Loaded config def from plugin (connection/paramiko_ssh)
41684 1727204443.44070: Loading Connection 'paramiko_ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/paramiko_ssh.py (found_in_cache=False, class_only=True)
41684 1727204443.44920: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py (found_in_cache=True, class_only=False)
41684 1727204443.44955: Loaded config def from plugin (connection/psrp)
41684 1727204443.44958: Loading Connection 'psrp' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/psrp.py (found_in_cache=False, class_only=True)
41684 1727204443.45619: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py (found_in_cache=True, class_only=False)
41684 1727204443.45655: Loaded config def from plugin (connection/ssh)
41684 1727204443.45657: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=False, class_only=True)
41684 1727204443.45966: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py (found_in_cache=True, class_only=False)
41684 1727204443.46000: Loaded config def from plugin (connection/winrm)
41684 1727204443.46003: Loading Connection 'winrm' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/winrm.py (found_in_cache=False, class_only=True)
41684 1727204443.46030: '/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/__init__' skipped due to reserved name
41684 1727204443.46088: Loading ModuleDocFragment 'shell_windows' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/shell_windows.py
41684 1727204443.46147: Loaded config def from plugin (shell/cmd)
41684 1727204443.46149: Loading ShellModule 'cmd' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/cmd.py (found_in_cache=False, class_only=True)
41684 1727204443.46173: Loading ModuleDocFragment 'shell_windows' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/shell_windows.py (found_in_cache=True, class_only=False)
41684 1727204443.46230: Loaded config def from plugin (shell/powershell)
41684 1727204443.46231: Loading ShellModule 'powershell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/powershell.py (found_in_cache=False, class_only=True)
41684 1727204443.46281: Loading ModuleDocFragment 'shell_common' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/shell_common.py
41684 1727204443.46435: Loaded config def from plugin (shell/sh)
41684 1727204443.46438: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=False, class_only=True)
41684 1727204443.46470: '/usr/local/lib/python3.12/site-packages/ansible/plugins/become/__init__' skipped due to reserved name
41684 1727204443.46582: Loaded config def from plugin (become/runas)
41684 1727204443.46584: Loading BecomeModule 'runas' from /usr/local/lib/python3.12/site-packages/ansible/plugins/become/runas.py (found_in_cache=False, class_only=True)
41684 1727204443.46787: Loaded config def from plugin (become/su)
41684 1727204443.46789: Loading BecomeModule 'su' from /usr/local/lib/python3.12/site-packages/ansible/plugins/become/su.py (found_in_cache=False, class_only=True)
41684 1727204443.46935: Loaded config def from plugin (become/sudo)
41684 1727204443.46937: Loading BecomeModule 'sudo' from /usr/local/lib/python3.12/site-packages/ansible/plugins/become/sudo.py (found_in_cache=False, class_only=True)
running playbook inside collection fedora.linux_system_roles
41684 1727204443.46969: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/tests_route_device_nm.yml
41684 1727204443.47268: in VariableManager get_vars()
41684 1727204443.47288: done with get_vars()
41684 1727204443.47406: trying /usr/local/lib/python3.12/site-packages/ansible/modules
41684 1727204443.49937: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/action
41684 1727204443.50020: in VariableManager get_vars()
41684 1727204443.50023: done with get_vars()
41684 1727204443.50026: variable 'playbook_dir' from source: magic vars
41684 1727204443.50026: variable 'ansible_playbook_python' from source: magic vars
41684 1727204443.50027: variable 'ansible_config_file' from source: magic vars
41684 1727204443.50027: variable 'groups' from source: magic vars
41684 1727204443.50029: variable 'omit' from source: magic vars
41684 1727204443.50029: variable 'ansible_version' from source: magic vars
41684 1727204443.50030: variable 'ansible_check_mode' from source: magic vars
41684 1727204443.50030: variable 'ansible_diff_mode' from source: magic vars
41684 1727204443.50031: variable 'ansible_forks' from source: magic vars
41684 1727204443.50031: variable 'ansible_inventory_sources' from source: magic vars
41684 1727204443.50032: variable 'ansible_skip_tags' from source: magic vars
41684 1727204443.50032: variable 'ansible_limit' from source: magic vars
41684 1727204443.50033: variable 'ansible_run_tags' from source: magic vars
41684 1727204443.50033: variable 'ansible_verbosity' from source: magic vars
41684 1727204443.50060: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_route_device.yml
41684 1727204443.50625: in VariableManager get_vars()
41684 1727204443.50637: done with get_vars()
41684 1727204443.50660: in VariableManager get_vars()
41684 1727204443.50671: done with get_vars()
41684 1727204443.50693: in VariableManager get_vars()
41684 1727204443.50700: done with get_vars()
41684 1727204443.50772: in VariableManager get_vars()
41684 1727204443.50781: done with get_vars()
41684 1727204443.50803: in VariableManager get_vars()
41684 1727204443.50810: done with get_vars()
41684 1727204443.50831: in VariableManager get_vars()
41684 1727204443.50849: done with get_vars()
41684 1727204443.50888: in VariableManager get_vars()
41684 1727204443.50897: done with get_vars()
41684 1727204443.50900: variable 'omit' from source: magic vars
41684 1727204443.50911: variable 'omit' from source: magic vars
41684 1727204443.50931: in VariableManager get_vars()
41684 1727204443.50937: done with get_vars()
41684 1727204443.50970: in VariableManager get_vars()
41684 1727204443.50979: done with get_vars()
41684 1727204443.51003: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/defaults/main.yml
41684 1727204443.51132: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/meta/main.yml
41684 1727204443.51212: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml
41684 1727204443.51582: in VariableManager get_vars()
41684 1727204443.51595: done with get_vars()
41684 1727204443.51883: trying /usr/local/lib/python3.12/site-packages/ansible/modules/__pycache__
41684 1727204443.51971: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__
redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf
41684 1727204443.53302: in VariableManager get_vars()
41684 1727204443.53314: done with get_vars()
41684 1727204443.53317: variable 'omit' from source: magic vars
41684 1727204443.53325: variable 'omit' from source: magic vars
41684 1727204443.53345: in VariableManager get_vars()
41684 1727204443.53355: done with get_vars()
41684 1727204443.53372: in VariableManager get_vars()
41684 1727204443.53382: done with get_vars()
41684 1727204443.53402: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/defaults/main.yml
41684 1727204443.53470: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/meta/main.yml
41684 1727204443.53513: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml
41684 1727204443.53750: in VariableManager get_vars()
41684 1727204443.53769: done with get_vars()
redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf
41684 1727204443.55721: in VariableManager get_vars()
41684 1727204443.55744: done with get_vars()
41684 1727204443.55788: in VariableManager get_vars()
41684 1727204443.55807: done with get_vars()
41684 1727204443.55919: in VariableManager get_vars()
41684 1727204443.55958: done with get_vars()
41684 1727204443.56001: in VariableManager get_vars()
41684 1727204443.56020: done with get_vars()
41684 1727204443.56059: in VariableManager get_vars()
41684 1727204443.56083: done with get_vars()
41684 1727204443.56121: in VariableManager get_vars()
41684 1727204443.56140: done with get_vars()
41684 1727204443.56205: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/callback
41684 1727204443.56221: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__
redirecting (type: callback) ansible.builtin.debug to ansible.posix.debug
redirecting (type: callback) ansible.builtin.debug to ansible.posix.debug
41684 1727204443.56480: Loading ModuleDocFragment 'default_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/default_callback.py
41684 1727204443.56651: Loaded config def from plugin (callback/ansible_collections.ansible.posix.plugins.callback.debug)
41684 1727204443.56654: Loading CallbackModule 'ansible_collections.ansible.posix.plugins.callback.debug' from /tmp/collections-G1p/ansible_collections/ansible/posix/plugins/callback/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback)
41684 1727204443.56692: '/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__init__' skipped due to reserved name
41684 1727204443.56718: Loading ModuleDocFragment 'default_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/default_callback.py (found_in_cache=True, class_only=False)
41684 1727204443.56901: Loading ModuleDocFragment 'result_format_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/result_format_callback.py
41684 1727204443.56966: Loaded config def from plugin (callback/default)
41684 1727204443.56969: Loading CallbackModule 'default' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/default.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback) (found_in_cache=False, class_only=True)
41684 1727204443.59656: Loaded config def from plugin (callback/junit)
41684 1727204443.59659: Loading CallbackModule 'junit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/junit.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback) (found_in_cache=False, class_only=True)
41684 1727204443.59700: Loading ModuleDocFragment 'result_format_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/result_format_callback.py (found_in_cache=True, class_only=False)
41684 1727204443.59738: Loaded config def from plugin (callback/minimal)
41684 1727204443.59740: Loading CallbackModule 'minimal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/minimal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback) (found_in_cache=False, class_only=True)
41684 1727204443.59771: Loading CallbackModule 'oneline' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/oneline.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback) (found_in_cache=False, class_only=True)
41684 1727204443.59812: Loaded config def from plugin (callback/tree)
41684 1727204443.59814: Loading CallbackModule 'tree' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/tree.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback) (found_in_cache=False, class_only=True)
redirecting (type: callback) ansible.builtin.profile_tasks to ansible.posix.profile_tasks
41684 1727204443.59891: Loaded config def from plugin (callback/ansible_collections.ansible.posix.plugins.callback.profile_tasks)
41684 1727204443.59893: Loading CallbackModule 'ansible_collections.ansible.posix.plugins.callback.profile_tasks' from /tmp/collections-G1p/ansible_collections/ansible/posix/plugins/callback/profile_tasks.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback) (found_in_cache=False, class_only=True)
Skipping callback 'default', as we already have a stdout callback.
Skipping callback 'minimal', as we already have a stdout callback.
Skipping callback 'oneline', as we already have a stdout callback.
PLAYBOOK: tests_route_device_nm.yml ********************************************
2 plays in /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/tests_route_device_nm.yml
41684 1727204443.59912: in VariableManager get_vars()
41684 1727204443.59922: done with get_vars()
41684 1727204443.59925: in VariableManager get_vars()
41684 1727204443.59930: done with get_vars()
41684 1727204443.59932: variable 'omit' from source: magic vars
41684 1727204443.59954: in VariableManager get_vars()
41684 1727204443.59966: done with get_vars()
41684 1727204443.59981: variable 'omit' from source: magic vars
PLAY [Run playbook 'playbooks/tests_route_device.yml' with nm as provider] *****
41684 1727204443.60390: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy
41684 1727204443.60468: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py
41684 1727204443.60497: getting the remaining hosts for this loop
41684 1727204443.60499: done getting the remaining hosts for this loop
41684 1727204443.60502: getting the next task for host managed-node1
41684 1727204443.60505: done getting next task for host managed-node1
41684 1727204443.60506: ^ task is: TASK: Gathering Facts
41684 1727204443.60508: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
41684 1727204443.60510: getting variables
41684 1727204443.60511: in VariableManager get_vars()
41684 1727204443.60520: Calling all_inventory to load vars for managed-node1
41684 1727204443.60523: Calling groups_inventory to load vars for managed-node1
41684 1727204443.60525: Calling all_plugins_inventory to load vars for managed-node1
41684 1727204443.60536: Calling all_plugins_play to load vars for managed-node1
41684 1727204443.60546: Calling groups_plugins_inventory to load vars for managed-node1
41684 1727204443.60549: Calling groups_plugins_play to load vars for managed-node1
41684 1727204443.60585: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
41684 1727204443.60634: done with get_vars()
41684 1727204443.60639: done getting variables
41684 1727204443.60704: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True)
TASK [Gathering Facts] *********************************************************
task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/tests_route_device_nm.yml:6
Tuesday 24 September 2024  15:00:43 -0400 (0:00:00.008)       0:00:00.008 *****
41684 1727204443.60719: entering _queue_task() for managed-node1/gather_facts
41684 1727204443.60720: Creating lock for gather_facts
41684 1727204443.60992: worker is 1 (out of 1 available)
41684 1727204443.61005: exiting _queue_task() for managed-node1/gather_facts
41684 1727204443.61017: done queuing things up, now waiting for results queue to drain
41684 1727204443.61019: waiting for pending results...
41684 1727204443.61129: running TaskExecutor() for managed-node1/TASK: Gathering Facts
41684 1727204443.61191: in run() - task 0affcd87-79f5-3839-086d-0000000000bf
41684 1727204443.61203: variable 'ansible_search_path' from source: unknown
41684 1727204443.61231: calling self._execute()
41684 1727204443.61282: variable 'ansible_host' from source: host vars for 'managed-node1'
41684 1727204443.61293: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1'
41684 1727204443.61301: variable 'omit' from source: magic vars
41684 1727204443.61376: variable 'omit' from source: magic vars
41684 1727204443.61396: variable 'omit' from source: magic vars
41684 1727204443.61419: variable 'omit' from source: magic vars
41684 1727204443.61455: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection
41684 1727204443.62072: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False)
41684 1727204443.62076: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell
41684 1727204443.62078: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
41684 1727204443.62080: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
41684 1727204443.62082: variable 'inventory_hostname' from source: host vars for 'managed-node1'
41684 1727204443.62083: variable 'ansible_host' from source: host vars for 'managed-node1'
41684 1727204443.62086: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1'
41684 1727204443.62088: Set connection var ansible_connection to ssh
41684 1727204443.62089: Set connection var ansible_pipelining to False
41684 1727204443.62091: Set connection var ansible_module_compression to ZIP_DEFLATED
41684 1727204443.62093: Set connection var ansible_timeout to 10
41684 1727204443.62095: Set connection var ansible_shell_executable to /bin/sh
41684 1727204443.62096: Set connection var ansible_shell_type to sh
41684 1727204443.62098: variable 'ansible_shell_executable' from source: unknown
41684 1727204443.62100: variable 'ansible_connection' from source: unknown
41684 1727204443.62102: variable 'ansible_module_compression' from source: unknown
41684 1727204443.62103: variable 'ansible_shell_type' from source: unknown
41684 1727204443.62105: variable 'ansible_shell_executable' from source: unknown
41684 1727204443.62107: variable 'ansible_host' from source: host vars for 'managed-node1'
41684 1727204443.62108: variable 'ansible_pipelining' from source: unknown
41684 1727204443.62110: variable 'ansible_timeout' from source: unknown
41684 1727204443.62112: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1'
41684 1727204443.62114: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False)
41684 1727204443.62116: variable 'omit' from source: magic vars
41684 1727204443.62118: starting attempt loop
41684 1727204443.62119: running the handler
41684 1727204443.62121: variable 'ansible_facts' from source: unknown
41684 1727204443.62122: _low_level_execute_command(): starting
41684 1727204443.62124: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0'
41684 1727204443.62632: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<<
41684 1727204443.62642: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<<
41684 1727204443.62691: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
41684 1727204443.62694: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<<
41684 1727204443.62697: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
41684 1727204443.62740: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<<
41684 1727204443.62753: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<<
41684 1727204443.62827: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
41684 1727204443.64477: stdout chunk (state=3): >>>/root <<<
41684 1727204443.64582: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
41684 1727204443.64642: stderr chunk (state=3): >>><<<
41684 1727204443.64645: stdout chunk (state=3): >>><<<
41684 1727204443.64667: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0
41684 1727204443.64682: _low_level_execute_command(): starting
41684 1727204443.64690: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204443.6466987-41921-219615563569101 `" && echo ansible-tmp-1727204443.6466987-41921-219615563569101="` echo /root/.ansible/tmp/ansible-tmp-1727204443.6466987-41921-219615563569101 `" ) && sleep 0'
41684 1727204443.65158: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<<
41684 1727204443.65176: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<<
41684 1727204443.65197: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found <<<
41684 1727204443.65232: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found <<<
41684 1727204443.65256: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
41684 1727204443.65279: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<<
41684 1727204443.65291: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<<
41684 1727204443.65358: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
41684 1727204443.67225: stdout chunk (state=3): >>>ansible-tmp-1727204443.6466987-41921-219615563569101=/root/.ansible/tmp/ansible-tmp-1727204443.6466987-41921-219615563569101 <<<
41684 1727204443.67368: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
41684 1727204443.67456: stderr chunk (state=3): >>><<<
41684 1727204443.67460: stdout chunk (state=3): >>><<<
41684 1727204443.67769: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204443.6466987-41921-219615563569101=/root/.ansible/tmp/ansible-tmp-1727204443.6466987-41921-219615563569101 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0
41684 1727204443.67774: variable 'ansible_module_compression' from source: unknown
41684 1727204443.67777: ANSIBALLZ: Using generic lock for ansible.legacy.setup
41684 1727204443.67780: ANSIBALLZ: Acquiring lock
41684 1727204443.67783: ANSIBALLZ: Lock acquired: 139842516808240
41684 1727204443.67785: ANSIBALLZ: Creating module
41684 1727204444.21068: ANSIBALLZ: Writing module into payload
41684 1727204444.21541: ANSIBALLZ: Writing module
41684 1727204444.21584: ANSIBALLZ: Renaming module
41684 1727204444.21650: ANSIBALLZ: Done creating module
41684 1727204444.21677: variable 'ansible_facts' from source: unknown
41684 1727204444.21689: variable 'inventory_hostname' from source: host vars for 'managed-node1'
41684 1727204444.21730: _low_level_execute_command(): starting
41684 1727204444.21741: _low_level_execute_command(): executing: /bin/sh -c 'echo PLATFORM; uname; echo FOUND; command -v '"'"'python3.12'"'"'; command -v '"'"'python3.11'"'"'; command -v '"'"'python3.10'"'"'; command -v '"'"'python3.9'"'"'; command -v '"'"'python3.8'"'"'; command -v '"'"'python3.7'"'"'; command -v '"'"'/usr/bin/python3'"'"'; command -v '"'"'python3'"'"'; echo ENDFOUND && sleep 0'
41684 1727204444.23198: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<<
41684 1727204444.23202: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<<
41684 1727204444.23235: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found <<<
41684 1727204444.23239: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<<
41684 1727204444.23243: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
41684 1727204444.23972: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<<
41684 1727204444.23985: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<<
41684 1727204444.24067: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
41684 1727204444.25690: stdout chunk (state=3): >>>PLATFORM <<<
41684 1727204444.25781: stdout chunk (state=3): >>>Linux <<<
41684 1727204444.25785: stdout chunk (state=3): >>>FOUND /usr/bin/python3.9 <<<
41684 1727204444.25787: stdout chunk (state=3): >>>/usr/bin/python3 /usr/bin/python3 ENDFOUND <<<
41684 1727204444.25921: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
41684 1727204444.26007: stderr chunk (state=3): >>><<<
41684 1727204444.26012: stdout chunk (state=3): >>><<<
41684 1727204444.26070: _low_level_execute_command() done: rc=0, stdout=PLATFORM Linux FOUND /usr/bin/python3.9 /usr/bin/python3 /usr/bin/python3 ENDFOUND , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host
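The inventory parsed earlier in this log (three hosts added to group all, each with ansible_host and ansible_ssh_extra_args host vars, then reconciled into ungrouped) would correspond to a YAML inventory roughly like the sketch below. This is a hypothetical reconstruction: the log confirms only the host names and which variables were set, not their values; managed-node1's address 10.31.9.148 appears later in the SSH debug output, while the other addresses and all ansible_ssh_extra_args values are placeholders.

```yaml
# Hypothetical shape of /tmp/network-M6W/inventory-5vW.yml (values are
# placeholders except 10.31.9.148, which the SSH debug output confirms
# for managed-node1).
all:
  hosts:
    managed-node1:
      ansible_host: 10.31.9.148
      ansible_ssh_extra_args: "-o UserKnownHostsFile=/dev/null"  # placeholder
    managed-node2:
      ansible_host: 10.31.9.149                                  # placeholder
      ansible_ssh_extra_args: "-o UserKnownHostsFile=/dev/null"  # placeholder
    managed-node3:
      ansible_host: 10.31.9.150                                  # placeholder
      ansible_ssh_extra_args: "-o UserKnownHostsFile=/dev/null"  # placeholder
```

Hosts declared directly under `all` with no explicit group end up in `ungrouped`, which matches the "Group ungrouped now contains managed-node1/2/3" records above.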
10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41684 1727204444.26076 [managed-node1]: found interpreters: ['/usr/bin/python3.9', '/usr/bin/python3', '/usr/bin/python3'] 41684 1727204444.26172: _low_level_execute_command(): starting 41684 1727204444.26176: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 && sleep 0' 41684 1727204444.26336: Sending initial data 41684 1727204444.26340: Sent initial data (1181 bytes) 41684 1727204444.27614: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 41684 1727204444.27634: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41684 1727204444.27646: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41684 1727204444.27661: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41684 1727204444.27819: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 41684 1727204444.27826: stderr chunk (state=3): >>>debug2: match not found <<< 41684 1727204444.27837: stderr chunk (state=3): >>>debug1: Reading configuration 
data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204444.27851: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 41684 1727204444.27858: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 41684 1727204444.27868: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 41684 1727204444.27877: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41684 1727204444.27887: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41684 1727204444.27899: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41684 1727204444.27905: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 41684 1727204444.27914: stderr chunk (state=3): >>>debug2: match found <<< 41684 1727204444.27925: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204444.28018: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 41684 1727204444.28152: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41684 1727204444.28172: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41684 1727204444.28259: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41684 1727204444.32028: stdout chunk (state=3): >>>{"platform_dist_result": [], "osrelease_content": "NAME=\"CentOS Stream\"\nVERSION=\"9\"\nID=\"centos\"\nID_LIKE=\"rhel fedora\"\nVERSION_ID=\"9\"\nPLATFORM_ID=\"platform:el9\"\nPRETTY_NAME=\"CentOS Stream 9\"\nANSI_COLOR=\"0;31\"\nLOGO=\"fedora-logo-icon\"\nCPE_NAME=\"cpe:/o:centos:centos:9\"\nHOME_URL=\"https://centos.org/\"\nBUG_REPORT_URL=\"https://issues.redhat.com/\"\nREDHAT_SUPPORT_PRODUCT=\"Red Hat Enterprise Linux 9\"\nREDHAT_SUPPORT_PRODUCT_VERSION=\"CentOS Stream\"\n"} <<< 
41684 1727204444.32490: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41684 1727204444.32494: stdout chunk (state=3): >>><<< 41684 1727204444.32500: stderr chunk (state=3): >>><<< 41684 1727204444.32514: _low_level_execute_command() done: rc=0, stdout={"platform_dist_result": [], "osrelease_content": "NAME=\"CentOS Stream\"\nVERSION=\"9\"\nID=\"centos\"\nID_LIKE=\"rhel fedora\"\nVERSION_ID=\"9\"\nPLATFORM_ID=\"platform:el9\"\nPRETTY_NAME=\"CentOS Stream 9\"\nANSI_COLOR=\"0;31\"\nLOGO=\"fedora-logo-icon\"\nCPE_NAME=\"cpe:/o:centos:centos:9\"\nHOME_URL=\"https://centos.org/\"\nBUG_REPORT_URL=\"https://issues.redhat.com/\"\nREDHAT_SUPPORT_PRODUCT=\"Red Hat Enterprise Linux 9\"\nREDHAT_SUPPORT_PRODUCT_VERSION=\"CentOS Stream\"\n"} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41684 1727204444.32594: variable 'ansible_facts' from source: unknown 41684 1727204444.32598: 
variable 'ansible_facts' from source: unknown 41684 1727204444.32607: variable 'ansible_module_compression' from source: unknown 41684 1727204444.32654: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-41684fyviudxd/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 41684 1727204444.32684: variable 'ansible_facts' from source: unknown 41684 1727204444.32837: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204443.6466987-41921-219615563569101/AnsiballZ_setup.py 41684 1727204444.33445: Sending initial data 41684 1727204444.33449: Sent initial data (154 bytes) 41684 1727204444.36193: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 41684 1727204444.36238: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41684 1727204444.36250: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41684 1727204444.36268: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41684 1727204444.36310: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 41684 1727204444.36383: stderr chunk (state=3): >>>debug2: match not found <<< 41684 1727204444.36400: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204444.36415: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 41684 1727204444.36423: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 41684 1727204444.36430: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 41684 1727204444.36438: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41684 1727204444.36453: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41684 1727204444.36462: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf <<< 41684 1727204444.36476: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 41684 1727204444.36483: stderr chunk (state=3): >>>debug2: match found <<< 41684 1727204444.36495: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204444.36572: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 41684 1727204444.36735: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41684 1727204444.36746: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41684 1727204444.36838: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41684 1727204444.38537: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 41684 1727204444.38593: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 41684 1727204444.38643: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-41684fyviudxd/tmppnsckr2o /root/.ansible/tmp/ansible-tmp-1727204443.6466987-41921-219615563569101/AnsiballZ_setup.py <<< 41684 1727204444.38693: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 41684 1727204444.42257: stderr chunk (state=3): >>>debug2: Received exit 
status from master 0 <<< 41684 1727204444.42270: stderr chunk (state=3): >>><<< 41684 1727204444.42274: stdout chunk (state=3): >>><<< 41684 1727204444.42300: done transferring module to remote 41684 1727204444.42315: _low_level_execute_command(): starting 41684 1727204444.42318: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204443.6466987-41921-219615563569101/ /root/.ansible/tmp/ansible-tmp-1727204443.6466987-41921-219615563569101/AnsiballZ_setup.py && sleep 0' 41684 1727204444.44041: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 41684 1727204444.44296: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41684 1727204444.44300: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41684 1727204444.44307: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41684 1727204444.44347: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 41684 1727204444.44353: stderr chunk (state=3): >>>debug2: match not found <<< 41684 1727204444.44363: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204444.44382: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 41684 1727204444.44390: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 41684 1727204444.44396: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 41684 1727204444.44403: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41684 1727204444.44412: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41684 1727204444.44423: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41684 1727204444.44431: stderr chunk 
(state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 41684 1727204444.44437: stderr chunk (state=3): >>>debug2: match found <<< 41684 1727204444.44448: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204444.44734: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 41684 1727204444.44754: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41684 1727204444.44770: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41684 1727204444.44860: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41684 1727204444.46675: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41684 1727204444.46679: stdout chunk (state=3): >>><<< 41684 1727204444.46685: stderr chunk (state=3): >>><<< 41684 1727204444.46701: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: 
mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41684 1727204444.46704: _low_level_execute_command(): starting 41684 1727204444.46710: _low_level_execute_command(): executing: /bin/sh -c 'PYTHONVERBOSE=1 /usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204443.6466987-41921-219615563569101/AnsiballZ_setup.py && sleep 0' 41684 1727204444.49168: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 41684 1727204444.49397: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41684 1727204444.49415: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41684 1727204444.49434: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41684 1727204444.49509: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 41684 1727204444.49522: stderr chunk (state=3): >>>debug2: match not found <<< 41684 1727204444.49611: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204444.49631: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 41684 1727204444.49646: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 41684 1727204444.49659: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 41684 1727204444.49677: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41684 1727204444.49692: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41684 1727204444.49712: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41684 1727204444.49834: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 
41684 1727204444.49846: stderr chunk (state=3): >>>debug2: match found <<< 41684 1727204444.49859: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204444.50168: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 41684 1727204444.50186: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41684 1727204444.50201: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41684 1727204444.50385: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41684 1727204444.52297: stdout chunk (state=3): >>>import _frozen_importlib # frozen import _imp # builtin <<< 41684 1727204444.52301: stdout chunk (state=3): >>>import '_thread' # import '_warnings' # import '_weakref' # <<< 41684 1727204444.52360: stdout chunk (state=3): >>>import '_io' # import 'marshal' # <<< 41684 1727204444.52401: stdout chunk (state=3): >>>import 'posix' # <<< 41684 1727204444.52427: stdout chunk (state=3): >>>import '_frozen_importlib_external' # <<< 41684 1727204444.52430: stdout chunk (state=3): >>># installing zipimport hook <<< 41684 1727204444.52470: stdout chunk (state=3): >>>import 'time' # <<< 41684 1727204444.52482: stdout chunk (state=3): >>>import 'zipimport' # # installed zipimport hook <<< 41684 1727204444.52532: stdout chunk (state=3): >>># /usr/lib64/python3.9/encodings/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/encodings/__init__.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/__init__.cpython-39.pyc' <<< 41684 1727204444.52571: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/codecs.cpython-39.pyc matches /usr/lib64/python3.9/codecs.py <<< 41684 1727204444.52575: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/codecs.cpython-39.pyc'<<< 41684 1727204444.52597: stdout chunk (state=3): >>> import '_codecs' # <<< 41684 1727204444.52600: stdout 
chunk (state=3): >>>import 'codecs' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb5d89b3dc0> <<< 41684 1727204444.52657: stdout chunk (state=3): >>># /usr/lib64/python3.9/encodings/__pycache__/aliases.cpython-39.pyc matches /usr/lib64/python3.9/encodings/aliases.py <<< 41684 1727204444.52672: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/encodings/__pycache__/aliases.cpython-39.pyc' <<< 41684 1727204444.52676: stdout chunk (state=3): >>>import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb5d89583a0> <<< 41684 1727204444.52678: stdout chunk (state=3): >>>import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb5d89b3b20> <<< 41684 1727204444.52707: stdout chunk (state=3): >>># /usr/lib64/python3.9/encodings/__pycache__/utf_8.cpython-39.pyc matches /usr/lib64/python3.9/encodings/utf_8.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/utf_8.cpython-39.pyc' <<< 41684 1727204444.52712: stdout chunk (state=3): >>>import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb5d89b3ac0> <<< 41684 1727204444.52766: stdout chunk (state=3): >>>import '_signal' # <<< 41684 1727204444.52773: stdout chunk (state=3): >>># /usr/lib64/python3.9/encodings/__pycache__/latin_1.cpython-39.pyc matches /usr/lib64/python3.9/encodings/latin_1.py <<< 41684 1727204444.52778: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/encodings/__pycache__/latin_1.cpython-39.pyc' <<< 41684 1727204444.52780: stdout chunk (state=3): >>>import 'encodings.latin_1' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb5d8958490><<< 41684 1727204444.52782: stdout chunk (state=3): >>> <<< 41684 1727204444.52816: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/io.cpython-39.pyc matches /usr/lib64/python3.9/io.py # code object from '/usr/lib64/python3.9/__pycache__/io.cpython-39.pyc' <<< 41684 1727204444.52822: stdout chunk (state=3): 
>>># /usr/lib64/python3.9/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/abc.py # code object from '/usr/lib64/python3.9/__pycache__/abc.cpython-39.pyc' <<< 41684 1727204444.52839: stdout chunk (state=3): >>>import '_abc' # import 'abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb5d8958940> <<< 41684 1727204444.52855: stdout chunk (state=3): >>>import 'io' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb5d8958670> <<< 41684 1727204444.52898: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/site.cpython-39.pyc matches /usr/lib64/python3.9/site.py <<< 41684 1727204444.52901: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/site.cpython-39.pyc' <<< 41684 1727204444.52919: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/os.cpython-39.pyc matches /usr/lib64/python3.9/os.py <<< 41684 1727204444.52958: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/os.cpython-39.pyc' <<< 41684 1727204444.52966: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/stat.cpython-39.pyc matches /usr/lib64/python3.9/stat.py <<< 41684 1727204444.52984: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/stat.cpython-39.pyc' <<< 41684 1727204444.53048: stdout chunk (state=3): >>>import '_stat' # import 'stat' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb5d890f190> <<< 41684 1727204444.53051: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/_collections_abc.cpython-39.pyc matches /usr/lib64/python3.9/_collections_abc.py # code object from '/usr/lib64/python3.9/__pycache__/_collections_abc.cpython-39.pyc' <<< 41684 1727204444.53117: stdout chunk (state=3): >>>import '_collections_abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb5d890f220> <<< 41684 1727204444.53146: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/posixpath.cpython-39.pyc matches 
/usr/lib64/python3.9/posixpath.py <<< 41684 1727204444.53169: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/posixpath.cpython-39.pyc' <<< 41684 1727204444.53187: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/genericpath.cpython-39.pyc matches /usr/lib64/python3.9/genericpath.py <<< 41684 1727204444.53191: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/genericpath.cpython-39.pyc' import 'genericpath' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb5d8932850> import 'posixpath' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb5d890f940> <<< 41684 1727204444.53207: stdout chunk (state=3): >>>import 'os' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb5d8970880> <<< 41684 1727204444.53229: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/_sitebuiltins.cpython-39.pyc matches /usr/lib64/python3.9/_sitebuiltins.py <<< 41684 1727204444.53232: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/_sitebuiltins.cpython-39.pyc' import '_sitebuiltins' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb5d8908d90> <<< 41684 1727204444.53299: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/_bootlocale.cpython-39.pyc matches /usr/lib64/python3.9/_bootlocale.py # code object from '/usr/lib64/python3.9/__pycache__/_bootlocale.cpython-39.pyc' <<< 41684 1727204444.53303: stdout chunk (state=3): >>>import '_locale' # import '_bootlocale' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb5d8932d90> <<< 41684 1727204444.53362: stdout chunk (state=3): >>>import 'site' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb5d8958970> <<< 41684 1727204444.53387: stdout chunk (state=3): >>>Python 3.9.19 (main, Aug 23 2024, 00:00:00) [GCC 11.5.0 20240719 (Red Hat 11.5.0-2)] on linux Type "help", "copyright", "credits" or "license" for more information. 
<<< 41684 1727204444.53709: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/base64.cpython-39.pyc matches /usr/lib64/python3.9/base64.py <<< 41684 1727204444.53740: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/base64.cpython-39.pyc' <<< 41684 1727204444.53743: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/re.cpython-39.pyc matches /usr/lib64/python3.9/re.py <<< 41684 1727204444.53758: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/re.cpython-39.pyc' <<< 41684 1727204444.53775: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/enum.cpython-39.pyc matches /usr/lib64/python3.9/enum.py <<< 41684 1727204444.53814: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/enum.cpython-39.pyc' <<< 41684 1727204444.53818: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/types.cpython-39.pyc matches /usr/lib64/python3.9/types.py <<< 41684 1727204444.53829: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/types.cpython-39.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb5d88aef10> <<< 41684 1727204444.53897: stdout chunk (state=3): >>>import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb5d88b40a0> <<< 41684 1727204444.53901: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/sre_compile.cpython-39.pyc matches /usr/lib64/python3.9/sre_compile.py <<< 41684 1727204444.53917: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/sre_compile.cpython-39.pyc' <<< 41684 1727204444.53939: stdout chunk (state=3): >>>import '_sre' # <<< 41684 1727204444.53958: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/sre_parse.cpython-39.pyc matches /usr/lib64/python3.9/sre_parse.py <<< 41684 1727204444.53967: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/sre_parse.cpython-39.pyc' <<< 
41684 1727204444.53984: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/sre_constants.cpython-39.pyc matches /usr/lib64/python3.9/sre_constants.py # code object from '/usr/lib64/python3.9/__pycache__/sre_constants.cpython-39.pyc' <<< 41684 1727204444.54010: stdout chunk (state=3): >>>import 'sre_constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb5d88a75b0> <<< 41684 1727204444.54028: stdout chunk (state=3): >>>import 'sre_parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb5d88af6a0> <<< 41684 1727204444.54053: stdout chunk (state=3): >>>import 'sre_compile' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb5d88ae3d0> <<< 41684 1727204444.54057: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/functools.cpython-39.pyc matches /usr/lib64/python3.9/functools.py <<< 41684 1727204444.54130: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/functools.cpython-39.pyc' <<< 41684 1727204444.54148: stdout chunk (state=3): >>># /usr/lib64/python3.9/collections/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/collections/__init__.py <<< 41684 1727204444.54198: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/collections/__pycache__/__init__.cpython-39.pyc' <<< 41684 1727204444.54201: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/heapq.cpython-39.pyc matches /usr/lib64/python3.9/heapq.py <<< 41684 1727204444.54203: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/heapq.cpython-39.pyc' <<< 41684 1727204444.54241: stdout chunk (state=3): >>># extension module '_heapq' loaded from '/usr/lib64/python3.9/lib-dynload/_heapq.cpython-39-x86_64-linux-gnu.so' <<< 41684 1727204444.54245: stdout chunk (state=3): >>># extension module '_heapq' executed from '/usr/lib64/python3.9/lib-dynload/_heapq.cpython-39-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 
0x7fb5d8831e20> import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb5d8831910> <<< 41684 1727204444.54286: stdout chunk (state=3): >>>import 'itertools' # <<< 41684 1727204444.54290: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/keyword.cpython-39.pyc matches /usr/lib64/python3.9/keyword.py # code object from '/usr/lib64/python3.9/__pycache__/keyword.cpython-39.pyc' <<< 41684 1727204444.54292: stdout chunk (state=3): >>>import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb5d8831f10> <<< 41684 1727204444.54315: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/operator.cpython-39.pyc matches /usr/lib64/python3.9/operator.py <<< 41684 1727204444.54321: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/operator.cpython-39.pyc' <<< 41684 1727204444.54343: stdout chunk (state=3): >>>import '_operator' # <<< 41684 1727204444.54349: stdout chunk (state=3): >>>import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb5d8831fd0> <<< 41684 1727204444.54371: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/reprlib.cpython-39.pyc matches /usr/lib64/python3.9/reprlib.py <<< 41684 1727204444.54392: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/reprlib.cpython-39.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb5d88420d0> <<< 41684 1727204444.54398: stdout chunk (state=3): >>>import '_collections' # <<< 41684 1727204444.54439: stdout chunk (state=3): >>>import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb5d8889d90> <<< 41684 1727204444.54443: stdout chunk (state=3): >>>import '_functools' # <<< 41684 1727204444.54466: stdout chunk (state=3): >>>import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb5d8882670> <<< 41684 1727204444.54527: stdout chunk (state=3): >>># 
/usr/lib64/python3.9/__pycache__/copyreg.cpython-39.pyc matches /usr/lib64/python3.9/copyreg.py <<< 41684 1727204444.54531: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/copyreg.cpython-39.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb5d88956d0> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb5d88b5e80> <<< 41684 1727204444.54543: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/struct.cpython-39.pyc matches /usr/lib64/python3.9/struct.py # code object from '/usr/lib64/python3.9/__pycache__/struct.cpython-39.pyc' <<< 41684 1727204444.54579: stdout chunk (state=3): >>># extension module '_struct' loaded from '/usr/lib64/python3.9/lib-dynload/_struct.cpython-39-x86_64-linux-gnu.so' <<< 41684 1727204444.54593: stdout chunk (state=3): >>># extension module '_struct' executed from '/usr/lib64/python3.9/lib-dynload/_struct.cpython-39-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb5d8842cd0> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb5d88892b0> <<< 41684 1727204444.54623: stdout chunk (state=3): >>># extension module 'binascii' loaded from '/usr/lib64/python3.9/lib-dynload/binascii.cpython-39-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.9/lib-dynload/binascii.cpython-39-x86_64-linux-gnu.so' <<< 41684 1727204444.54638: stdout chunk (state=3): >>>import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb5d88952e0> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb5d88bba30> <<< 41684 1727204444.54653: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/runpy.cpython-39.pyc matches /usr/lib64/python3.9/runpy.py # code object from '/usr/lib64/python3.9/__pycache__/runpy.cpython-39.pyc' <<< 41684 1727204444.54694: stdout chunk (state=3): >>># 
/usr/lib64/python3.9/importlib/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/importlib/__init__.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/__init__.cpython-39.pyc' <<< 41684 1727204444.54715: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/warnings.cpython-39.pyc matches /usr/lib64/python3.9/warnings.py <<< 41684 1727204444.54738: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/warnings.cpython-39.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb5d8842eb0> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb5d8842df0> <<< 41684 1727204444.54778: stdout chunk (state=3): >>># /usr/lib64/python3.9/importlib/__pycache__/machinery.cpython-39.pyc matches /usr/lib64/python3.9/importlib/machinery.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/machinery.cpython-39.pyc' <<< 41684 1727204444.54796: stdout chunk (state=3): >>>import 'importlib.machinery' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb5d8842d60> # /usr/lib64/python3.9/importlib/__pycache__/util.cpython-39.pyc matches /usr/lib64/python3.9/importlib/util.py <<< 41684 1727204444.54811: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/importlib/__pycache__/util.cpython-39.pyc' <<< 41684 1727204444.54828: stdout chunk (state=3): >>># /usr/lib64/python3.9/importlib/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/importlib/abc.py <<< 41684 1727204444.54848: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/importlib/__pycache__/abc.cpython-39.pyc' <<< 41684 1727204444.54858: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/typing.cpython-39.pyc matches /usr/lib64/python3.9/typing.py <<< 41684 1727204444.54909: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/typing.cpython-39.pyc' <<< 41684 1727204444.54928: stdout chunk (state=3): >>># 
/usr/lib64/python3.9/collections/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/collections/abc.py <<< 41684 1727204444.54951: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/collections/__pycache__/abc.cpython-39.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb5d85723d0> <<< 41684 1727204444.54974: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/contextlib.cpython-39.pyc matches /usr/lib64/python3.9/contextlib.py # code object from '/usr/lib64/python3.9/__pycache__/contextlib.cpython-39.pyc' <<< 41684 1727204444.55008: stdout chunk (state=3): >>>import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb5d85724c0> <<< 41684 1727204444.55128: stdout chunk (state=3): >>>import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb5d884af40> <<< 41684 1727204444.55175: stdout chunk (state=3): >>>import 'importlib.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb5d8844a90> import 'importlib.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb5d8844490> <<< 41684 1727204444.55206: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/pkgutil.cpython-39.pyc matches /usr/lib64/python3.9/pkgutil.py # code object from '/usr/lib64/python3.9/__pycache__/pkgutil.cpython-39.pyc' <<< 41684 1727204444.55246: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/weakref.cpython-39.pyc matches /usr/lib64/python3.9/weakref.py <<< 41684 1727204444.55257: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/weakref.cpython-39.pyc' <<< 41684 1727204444.55286: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/_weakrefset.cpython-39.pyc matches /usr/lib64/python3.9/_weakrefset.py # code object from '/usr/lib64/python3.9/__pycache__/_weakrefset.cpython-39.pyc' <<< 41684 1727204444.55298: stdout chunk (state=3): >>>import '_weakrefset' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7fb5d848d220> <<< 41684 1727204444.55329: stdout chunk (state=3): >>>import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb5d855d520> <<< 41684 1727204444.55379: stdout chunk (state=3): >>>import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb5d8844f10> <<< 41684 1727204444.55397: stdout chunk (state=3): >>>import 'runpy' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb5d88bb0a0> <<< 41684 1727204444.55406: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/shutil.cpython-39.pyc matches /usr/lib64/python3.9/shutil.py <<< 41684 1727204444.55429: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/shutil.cpython-39.pyc' <<< 41684 1727204444.55449: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/fnmatch.cpython-39.pyc matches /usr/lib64/python3.9/fnmatch.py # code object from '/usr/lib64/python3.9/__pycache__/fnmatch.cpython-39.pyc' <<< 41684 1727204444.55467: stdout chunk (state=3): >>>import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb5d849eb50> <<< 41684 1727204444.55480: stdout chunk (state=3): >>>import 'errno' # <<< 41684 1727204444.55507: stdout chunk (state=3): >>># extension module 'zlib' loaded from '/usr/lib64/python3.9/lib-dynload/zlib.cpython-39-x86_64-linux-gnu.so' <<< 41684 1727204444.55524: stdout chunk (state=3): >>># extension module 'zlib' executed from '/usr/lib64/python3.9/lib-dynload/zlib.cpython-39-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb5d849ee80> # /usr/lib64/python3.9/__pycache__/bz2.cpython-39.pyc matches /usr/lib64/python3.9/bz2.py <<< 41684 1727204444.55541: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/bz2.cpython-39.pyc' <<< 41684 1727204444.55555: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/_compression.cpython-39.pyc matches 
/usr/lib64/python3.9/_compression.py # code object from '/usr/lib64/python3.9/__pycache__/_compression.cpython-39.pyc' <<< 41684 1727204444.55575: stdout chunk (state=3): >>>import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb5d84b0790> <<< 41684 1727204444.55596: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/threading.cpython-39.pyc matches /usr/lib64/python3.9/threading.py <<< 41684 1727204444.55623: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/threading.cpython-39.pyc' <<< 41684 1727204444.55657: stdout chunk (state=3): >>>import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb5d84b0cd0> <<< 41684 1727204444.55693: stdout chunk (state=3): >>># extension module '_bz2' loaded from '/usr/lib64/python3.9/lib-dynload/_bz2.cpython-39-x86_64-linux-gnu.so' <<< 41684 1727204444.55710: stdout chunk (state=3): >>># extension module '_bz2' executed from '/usr/lib64/python3.9/lib-dynload/_bz2.cpython-39-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb5d843e400> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb5d849ef70> <<< 41684 1727204444.55731: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/lzma.cpython-39.pyc matches /usr/lib64/python3.9/lzma.py <<< 41684 1727204444.55743: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/lzma.cpython-39.pyc' <<< 41684 1727204444.55780: stdout chunk (state=3): >>># extension module '_lzma' loaded from '/usr/lib64/python3.9/lib-dynload/_lzma.cpython-39-x86_64-linux-gnu.so' <<< 41684 1727204444.55794: stdout chunk (state=3): >>># extension module '_lzma' executed from '/usr/lib64/python3.9/lib-dynload/_lzma.cpython-39-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb5d844e2e0> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 
0x7fb5d84b0610> import 'pwd' # <<< 41684 1727204444.55822: stdout chunk (state=3): >>># extension module 'grp' loaded from '/usr/lib64/python3.9/lib-dynload/grp.cpython-39-x86_64-linux-gnu.so' <<< 41684 1727204444.55837: stdout chunk (state=3): >>># extension module 'grp' executed from '/usr/lib64/python3.9/lib-dynload/grp.cpython-39-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb5d844e3a0> <<< 41684 1727204444.55862: stdout chunk (state=3): >>>import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb5d8842a30> <<< 41684 1727204444.55885: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/tempfile.cpython-39.pyc matches /usr/lib64/python3.9/tempfile.py <<< 41684 1727204444.55897: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/tempfile.cpython-39.pyc' <<< 41684 1727204444.55923: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/random.cpython-39.pyc matches /usr/lib64/python3.9/random.py <<< 41684 1727204444.55939: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/random.cpython-39.pyc' <<< 41684 1727204444.55969: stdout chunk (state=3): >>># extension module 'math' loaded from '/usr/lib64/python3.9/lib-dynload/math.cpython-39-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.9/lib-dynload/math.cpython-39-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb5d846a700> <<< 41684 1727204444.55988: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/bisect.cpython-39.pyc matches /usr/lib64/python3.9/bisect.py # code object from '/usr/lib64/python3.9/__pycache__/bisect.cpython-39.pyc' <<< 41684 1727204444.56019: stdout chunk (state=3): >>># extension module '_bisect' loaded from '/usr/lib64/python3.9/lib-dynload/_bisect.cpython-39-x86_64-linux-gnu.so' # extension module '_bisect' executed from 
'/usr/lib64/python3.9/lib-dynload/_bisect.cpython-39-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb5d846a9d0> <<< 41684 1727204444.56031: stdout chunk (state=3): >>>import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb5d846a7c0> <<< 41684 1727204444.56054: stdout chunk (state=3): >>># extension module '_random' loaded from '/usr/lib64/python3.9/lib-dynload/_random.cpython-39-x86_64-linux-gnu.so' <<< 41684 1727204444.56078: stdout chunk (state=3): >>># extension module '_random' executed from '/usr/lib64/python3.9/lib-dynload/_random.cpython-39-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb5d846a8b0> <<< 41684 1727204444.56089: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/hashlib.cpython-39.pyc matches /usr/lib64/python3.9/hashlib.py # code object from '/usr/lib64/python3.9/__pycache__/hashlib.cpython-39.pyc' <<< 41684 1727204444.56281: stdout chunk (state=3): >>># extension module '_hashlib' loaded from '/usr/lib64/python3.9/lib-dynload/_hashlib.cpython-39-x86_64-linux-gnu.so' <<< 41684 1727204444.56295: stdout chunk (state=3): >>># extension module '_hashlib' executed from '/usr/lib64/python3.9/lib-dynload/_hashlib.cpython-39-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb5d846ad00> <<< 41684 1727204444.56315: stdout chunk (state=3): >>># extension module '_blake2' loaded from '/usr/lib64/python3.9/lib-dynload/_blake2.cpython-39-x86_64-linux-gnu.so' <<< 41684 1727204444.56331: stdout chunk (state=3): >>># extension module '_blake2' executed from '/usr/lib64/python3.9/lib-dynload/_blake2.cpython-39-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb5d8476250> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb5d846a940> <<< 41684 1727204444.56343: stdout chunk (state=3): 
>>>import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb5d845da90> <<< 41684 1727204444.56366: stdout chunk (state=3): >>>import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb5d8842610> <<< 41684 1727204444.56390: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/zipfile.cpython-39.pyc matches /usr/lib64/python3.9/zipfile.py <<< 41684 1727204444.56444: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/zipfile.cpython-39.pyc' <<< 41684 1727204444.56494: stdout chunk (state=3): >>>import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb5d846aaf0> <<< 41684 1727204444.56620: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/encodings/cp437.pyc' <<< 41684 1727204444.56635: stdout chunk (state=3): >>>import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7fb5d83936d0> <<< 41684 1727204444.56898: stdout chunk (state=3): >>># zipimport: found 103 names in '/tmp/ansible_ansible.legacy.setup_payload__66pz3fr/ansible_ansible.legacy.setup_payload.zip' <<< 41684 1727204444.56901: stdout chunk (state=3): >>># zipimport: zlib available <<< 41684 1727204444.56991: stdout chunk (state=3): >>># zipimport: zlib available <<< 41684 1727204444.57023: stdout chunk (state=3): >>>import ansible # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload__66pz3fr/ansible_ansible.legacy.setup_payload.zip/ansible/__init__.py # zipimport: zlib available <<< 41684 1727204444.57051: stdout chunk (state=3): >>># zipimport: zlib available <<< 41684 1727204444.57058: stdout chunk (state=3): >>>import ansible.module_utils # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload__66pz3fr/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/__init__.py <<< 41684 1727204444.57078: stdout chunk (state=3): >>># zipimport: zlib available <<< 41684 1727204444.58327: stdout chunk (state=3): >>># zipimport: zlib available <<< 
41684 1727204444.59276: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/__future__.cpython-39.pyc matches /usr/lib64/python3.9/__future__.py <<< 41684 1727204444.59317: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/__future__.cpython-39.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb5d7dae820> <<< 41684 1727204444.59323: stdout chunk (state=3): >>># /usr/lib64/python3.9/json/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/json/__init__.py # code object from '/usr/lib64/python3.9/json/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/json/__pycache__/decoder.cpython-39.pyc matches /usr/lib64/python3.9/json/decoder.py <<< 41684 1727204444.59328: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/json/__pycache__/decoder.cpython-39.pyc' <<< 41684 1727204444.59342: stdout chunk (state=3): >>># /usr/lib64/python3.9/json/__pycache__/scanner.cpython-39.pyc matches /usr/lib64/python3.9/json/scanner.py # code object from '/usr/lib64/python3.9/json/__pycache__/scanner.cpython-39.pyc' <<< 41684 1727204444.59372: stdout chunk (state=3): >>># extension module '_json' loaded from '/usr/lib64/python3.9/lib-dynload/_json.cpython-39-x86_64-linux-gnu.so' <<< 41684 1727204444.59376: stdout chunk (state=3): >>># extension module '_json' executed from '/usr/lib64/python3.9/lib-dynload/_json.cpython-39-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb5d7dae160> <<< 41684 1727204444.59411: stdout chunk (state=3): >>>import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb5d7dae280> <<< 41684 1727204444.59448: stdout chunk (state=3): >>>import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb5d7daef70> <<< 41684 1727204444.59466: stdout chunk (state=3): >>># /usr/lib64/python3.9/json/__pycache__/encoder.cpython-39.pyc matches 
/usr/lib64/python3.9/json/encoder.py # code object from '/usr/lib64/python3.9/json/__pycache__/encoder.cpython-39.pyc' <<< 41684 1727204444.59525: stdout chunk (state=3): >>>import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb5d7dae4f0> <<< 41684 1727204444.59529: stdout chunk (state=3): >>>import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb5d7daed90> import 'atexit' # <<< 41684 1727204444.59547: stdout chunk (state=3): >>># extension module 'fcntl' loaded from '/usr/lib64/python3.9/lib-dynload/fcntl.cpython-39-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.9/lib-dynload/fcntl.cpython-39-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb5d7daefd0> <<< 41684 1727204444.59573: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/locale.cpython-39.pyc matches /usr/lib64/python3.9/locale.py <<< 41684 1727204444.59594: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/locale.cpython-39.pyc' <<< 41684 1727204444.59653: stdout chunk (state=3): >>>import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb5d7dae100> <<< 41684 1727204444.59657: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/platform.cpython-39.pyc matches /usr/lib64/python3.9/platform.py <<< 41684 1727204444.59675: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/platform.cpython-39.pyc' <<< 41684 1727204444.59687: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/subprocess.cpython-39.pyc matches /usr/lib64/python3.9/subprocess.py <<< 41684 1727204444.59718: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/subprocess.cpython-39.pyc' <<< 41684 1727204444.59744: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/signal.cpython-39.pyc matches /usr/lib64/python3.9/signal.py <<< 41684 1727204444.59747: stdout chunk 
(state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/signal.cpython-39.pyc' <<< 41684 1727204444.59822: stdout chunk (state=3): >>>import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb5d7d830d0> <<< 41684 1727204444.59863: stdout chunk (state=3): >>># extension module '_posixsubprocess' loaded from '/usr/lib64/python3.9/lib-dynload/_posixsubprocess.cpython-39-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.9/lib-dynload/_posixsubprocess.cpython-39-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb5d7c88340> <<< 41684 1727204444.59892: stdout chunk (state=3): >>># extension module 'select' loaded from '/usr/lib64/python3.9/lib-dynload/select.cpython-39-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.9/lib-dynload/select.cpython-39-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb5d7c88040> <<< 41684 1727204444.59916: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/selectors.cpython-39.pyc matches /usr/lib64/python3.9/selectors.py <<< 41684 1727204444.59920: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/selectors.cpython-39.pyc' <<< 41684 1727204444.59974: stdout chunk (state=3): >>>import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb5d7c88ca0> <<< 41684 1727204444.59978: stdout chunk (state=3): >>>import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb5d7d96dc0> <<< 41684 1727204444.60143: stdout chunk (state=3): >>>import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb5d7d963a0> <<< 41684 1727204444.60169: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/shlex.cpython-39.pyc matches /usr/lib64/python3.9/shlex.py <<< 41684 1727204444.60172: stdout chunk (state=3): >>># code object from 
'/usr/lib64/python3.9/__pycache__/shlex.cpython-39.pyc' <<< 41684 1727204444.60189: stdout chunk (state=3): >>>import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb5d7d96fd0> <<< 41684 1727204444.60209: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/traceback.cpython-39.pyc matches /usr/lib64/python3.9/traceback.py # code object from '/usr/lib64/python3.9/__pycache__/traceback.cpython-39.pyc' <<< 41684 1727204444.60247: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/linecache.cpython-39.pyc matches /usr/lib64/python3.9/linecache.py # code object from '/usr/lib64/python3.9/__pycache__/linecache.cpython-39.pyc' <<< 41684 1727204444.60271: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/tokenize.cpython-39.pyc matches /usr/lib64/python3.9/tokenize.py <<< 41684 1727204444.60291: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/tokenize.cpython-39.pyc' <<< 41684 1727204444.60320: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/token.cpython-39.pyc matches /usr/lib64/python3.9/token.py # code object from '/usr/lib64/python3.9/__pycache__/token.cpython-39.pyc' <<< 41684 1727204444.60323: stdout chunk (state=3): >>>import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb5d7de3d30> <<< 41684 1727204444.60397: stdout chunk (state=3): >>>import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb5d7db5d30> <<< 41684 1727204444.60400: stdout chunk (state=3): >>>import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb5d7db5400> import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb5d7d61b20> <<< 41684 1727204444.60441: stdout chunk (state=3): >>># extension module 'syslog' loaded from '/usr/lib64/python3.9/lib-dynload/syslog.cpython-39-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.9/lib-dynload/syslog.cpython-39-x86_64-linux-gnu.so' <<< 
41684 1727204444.60448: stdout chunk (state=3): >>>import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb5d7db5520> <<< 41684 1727204444.60468: stdout chunk (state=3): >>># /usr/lib64/python3.9/site-packages/systemd/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/__init__.cpython-39.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb5d7db5550> <<< 41684 1727204444.60486: stdout chunk (state=3): >>># /usr/lib64/python3.9/site-packages/systemd/__pycache__/journal.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/journal.py <<< 41684 1727204444.60503: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/journal.cpython-39.pyc' <<< 41684 1727204444.60515: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/datetime.cpython-39.pyc matches /usr/lib64/python3.9/datetime.py <<< 41684 1727204444.60559: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/datetime.cpython-39.pyc' <<< 41684 1727204444.60627: stdout chunk (state=3): >>># extension module '_datetime' loaded from '/usr/lib64/python3.9/lib-dynload/_datetime.cpython-39-x86_64-linux-gnu.so' <<< 41684 1727204444.60633: stdout chunk (state=3): >>># extension module '_datetime' executed from '/usr/lib64/python3.9/lib-dynload/_datetime.cpython-39-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb5d7cf6fd0> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb5d7df6250> <<< 41684 1727204444.60660: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/uuid.cpython-39.pyc matches /usr/lib64/python3.9/uuid.py <<< 41684 1727204444.60667: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/uuid.cpython-39.pyc' <<< 41684 
1727204444.60727: stdout chunk (state=3): >>># extension module '_uuid' loaded from '/usr/lib64/python3.9/lib-dynload/_uuid.cpython-39-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.9/lib-dynload/_uuid.cpython-39-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb5d7cf4850> <<< 41684 1727204444.60730: stdout chunk (state=3): >>>import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb5d7df63d0> <<< 41684 1727204444.60741: stdout chunk (state=3): >>># /usr/lib64/python3.9/logging/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/logging/__init__.py <<< 41684 1727204444.60805: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/logging/__pycache__/__init__.cpython-39.pyc' <<< 41684 1727204444.60808: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/string.cpython-39.pyc matches /usr/lib64/python3.9/string.py <<< 41684 1727204444.60811: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/string.cpython-39.pyc' import '_string' # <<< 41684 1727204444.60878: stdout chunk (state=3): >>>import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb5d7df6ca0> <<< 41684 1727204444.61004: stdout chunk (state=3): >>>import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb5d7cf47f0> <<< 41684 1727204444.61097: stdout chunk (state=3): >>># extension module 'systemd._journal' loaded from '/usr/lib64/python3.9/site-packages/systemd/_journal.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.9/site-packages/systemd/_journal.cpython-39-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb5d7d8ec10> <<< 41684 1727204444.61124: stdout chunk (state=3): >>># extension module 'systemd._reader' loaded from 
'/usr/lib64/python3.9/site-packages/systemd/_reader.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.9/site-packages/systemd/_reader.cpython-39-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb5d7df6fa0> <<< 41684 1727204444.61175: stdout chunk (state=3): >>># extension module 'systemd.id128' loaded from '/usr/lib64/python3.9/site-packages/systemd/id128.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.9/site-packages/systemd/id128.cpython-39-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb5d7df6550> <<< 41684 1727204444.61212: stdout chunk (state=3): >>>import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb5d7dee910> <<< 41684 1727204444.61220: stdout chunk (state=3): >>># /usr/lib64/python3.9/site-packages/systemd/__pycache__/daemon.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/daemon.py <<< 41684 1727204444.61222: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/daemon.cpython-39.pyc' <<< 41684 1727204444.61245: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/socket.cpython-39.pyc matches /usr/lib64/python3.9/socket.py # code object from '/usr/lib64/python3.9/__pycache__/socket.cpython-39.pyc' <<< 41684 1727204444.61296: stdout chunk (state=3): >>># extension module '_socket' loaded from '/usr/lib64/python3.9/lib-dynload/_socket.cpython-39-x86_64-linux-gnu.so' <<< 41684 1727204444.61299: stdout chunk (state=3): >>># extension module '_socket' executed from '/usr/lib64/python3.9/lib-dynload/_socket.cpython-39-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb5d7ce8940> <<< 41684 1727204444.61478: stdout chunk (state=3): >>># extension module 'array' loaded from 
'/usr/lib64/python3.9/lib-dynload/array.cpython-39-x86_64-linux-gnu.so' <<< 41684 1727204444.61481: stdout chunk (state=3): >>># extension module 'array' executed from '/usr/lib64/python3.9/lib-dynload/array.cpython-39-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb5d7d06d90> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb5d7cf2580> <<< 41684 1727204444.61527: stdout chunk (state=3): >>># extension module 'systemd._daemon' loaded from '/usr/lib64/python3.9/site-packages/systemd/_daemon.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.9/site-packages/systemd/_daemon.cpython-39-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb5d7ce8ee0> <<< 41684 1727204444.61534: stdout chunk (state=3): >>>import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb5d7cf29a0> <<< 41684 1727204444.61537: stdout chunk (state=3): >>># zipimport: zlib available <<< 41684 1727204444.61556: stdout chunk (state=3): >>># zipimport: zlib available import ansible.module_utils.compat # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload__66pz3fr/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/compat/__init__.py <<< 41684 1727204444.61576: stdout chunk (state=3): >>># zipimport: zlib available <<< 41684 1727204444.61642: stdout chunk (state=3): >>># zipimport: zlib available <<< 41684 1727204444.61738: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 41684 1727204444.61745: stdout chunk (state=3): >>>import ansible.module_utils.common # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload__66pz3fr/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/__init__.py <<< 41684 1727204444.61748: stdout chunk (state=3): >>># zipimport: zlib available <<< 41684 1727204444.61768: stdout chunk (state=3): 
>>># zipimport: zlib available import ansible.module_utils.common.text # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload__66pz3fr/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/text/__init__.py <<< 41684 1727204444.61786: stdout chunk (state=3): >>># zipimport: zlib available <<< 41684 1727204444.61877: stdout chunk (state=3): >>># zipimport: zlib available <<< 41684 1727204444.61975: stdout chunk (state=3): >>># zipimport: zlib available <<< 41684 1727204444.62428: stdout chunk (state=3): >>># zipimport: zlib available <<< 41684 1727204444.62888: stdout chunk (state=3): >>>import ansible.module_utils.six # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload__66pz3fr/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/six/__init__.py import 'ansible.module_utils.six.moves' # <<< 41684 1727204444.62908: stdout chunk (state=3): >>>import 'ansible.module_utils.six.moves.collections_abc' # import ansible.module_utils.common.text.converters # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload__66pz3fr/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/text/converters.py <<< 41684 1727204444.62928: stdout chunk (state=3): >>># /usr/lib64/python3.9/ctypes/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/ctypes/__init__.py # code object from '/usr/lib64/python3.9/ctypes/__pycache__/__init__.cpython-39.pyc' <<< 41684 1727204444.62995: stdout chunk (state=3): >>># extension module '_ctypes' loaded from '/usr/lib64/python3.9/lib-dynload/_ctypes.cpython-39-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.9/lib-dynload/_ctypes.cpython-39-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb5d7d367f0> <<< 41684 1727204444.63069: stdout chunk (state=3): >>># /usr/lib64/python3.9/ctypes/__pycache__/_endian.cpython-39.pyc matches /usr/lib64/python3.9/ctypes/_endian.py # code object from 
'/usr/lib64/python3.9/ctypes/__pycache__/_endian.cpython-39.pyc' <<< 41684 1727204444.63087: stdout chunk (state=3): >>>import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb5d7d04760> <<< 41684 1727204444.63090: stdout chunk (state=3): >>>import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb5d787a970> <<< 41684 1727204444.63141: stdout chunk (state=3): >>>import ansible.module_utils.compat.selinux # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload__66pz3fr/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/compat/selinux.py <<< 41684 1727204444.63144: stdout chunk (state=3): >>># zipimport: zlib available <<< 41684 1727204444.63172: stdout chunk (state=3): >>># zipimport: zlib available <<< 41684 1727204444.63176: stdout chunk (state=3): >>>import ansible.module_utils._text # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload__66pz3fr/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/_text.py <<< 41684 1727204444.63187: stdout chunk (state=3): >>># zipimport: zlib available <<< 41684 1727204444.63302: stdout chunk (state=3): >>># zipimport: zlib available <<< 41684 1727204444.63436: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/copy.cpython-39.pyc matches /usr/lib64/python3.9/copy.py # code object from '/usr/lib64/python3.9/__pycache__/copy.cpython-39.pyc' <<< 41684 1727204444.63460: stdout chunk (state=3): >>>import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb5d7d6c730> <<< 41684 1727204444.63469: stdout chunk (state=3): >>># zipimport: zlib available <<< 41684 1727204444.63855: stdout chunk (state=3): >>># zipimport: zlib available <<< 41684 1727204444.64278: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 41684 1727204444.64343: stdout chunk (state=3): >>>import ansible.module_utils.common.collections # loaded from Zip 
/tmp/ansible_ansible.legacy.setup_payload__66pz3fr/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/collections.py <<< 41684 1727204444.64345: stdout chunk (state=3): >>># zipimport: zlib available <<< 41684 1727204444.64586: stdout chunk (state=3): >>># zipimport: zlib available import ansible.module_utils.common.warnings # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload__66pz3fr/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/warnings.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.errors # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload__66pz3fr/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/errors.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.parsing # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload__66pz3fr/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/parsing/__init__.py <<< 41684 1727204444.64593: stdout chunk (state=3): >>># zipimport: zlib available <<< 41684 1727204444.64624: stdout chunk (state=3): >>># zipimport: zlib available <<< 41684 1727204444.64652: stdout chunk (state=3): >>>import ansible.module_utils.parsing.convert_bool # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload__66pz3fr/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/parsing/convert_bool.py <<< 41684 1727204444.64667: stdout chunk (state=3): >>># zipimport: zlib available <<< 41684 1727204444.64850: stdout chunk (state=3): >>># zipimport: zlib available <<< 41684 1727204444.65036: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/ast.cpython-39.pyc matches /usr/lib64/python3.9/ast.py <<< 41684 1727204444.65057: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/ast.cpython-39.pyc' <<< 41684 1727204444.65077: stdout chunk (state=3): >>>import '_ast' # <<< 41684 1727204444.65150: stdout chunk (state=3): >>>import 'ast' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7fb5d7db1370> <<< 41684 1727204444.65153: stdout chunk (state=3): >>># zipimport: zlib available <<< 41684 1727204444.65211: stdout chunk (state=3): >>># zipimport: zlib available <<< 41684 1727204444.65285: stdout chunk (state=3): >>>import ansible.module_utils.common.text.formatters # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload__66pz3fr/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/text/formatters.py <<< 41684 1727204444.65289: stdout chunk (state=3): >>>import ansible.module_utils.common.validation # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload__66pz3fr/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/validation.py import ansible.module_utils.common.parameters # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload__66pz3fr/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/parameters.py import ansible.module_utils.common.arg_spec # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload__66pz3fr/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/arg_spec.py <<< 41684 1727204444.65314: stdout chunk (state=3): >>># zipimport: zlib available <<< 41684 1727204444.65344: stdout chunk (state=3): >>># zipimport: zlib available <<< 41684 1727204444.65388: stdout chunk (state=3): >>>import ansible.module_utils.common.locale # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload__66pz3fr/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/locale.py # zipimport: zlib available <<< 41684 1727204444.65430: stdout chunk (state=3): >>># zipimport: zlib available <<< 41684 1727204444.65468: stdout chunk (state=3): >>># zipimport: zlib available <<< 41684 1727204444.65559: stdout chunk (state=3): >>># zipimport: zlib available <<< 41684 1727204444.65619: stdout chunk (state=3): >>># /usr/lib64/python3.9/site-packages/selinux/__pycache__/__init__.cpython-39.pyc matches 
/usr/lib64/python3.9/site-packages/selinux/__init__.py <<< 41684 1727204444.65647: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/site-packages/selinux/__pycache__/__init__.cpython-39.pyc' <<< 41684 1727204444.65726: stdout chunk (state=3): >>># extension module 'selinux._selinux' loaded from '/usr/lib64/python3.9/site-packages/selinux/_selinux.cpython-39-x86_64-linux-gnu.so' <<< 41684 1727204444.65729: stdout chunk (state=3): >>># extension module 'selinux._selinux' executed from '/usr/lib64/python3.9/site-packages/selinux/_selinux.cpython-39-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb5d7d23550> <<< 41684 1727204444.65816: stdout chunk (state=3): >>>import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb5d76f6eb0> <<< 41684 1727204444.65858: stdout chunk (state=3): >>>import ansible.module_utils.common.file # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload__66pz3fr/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/file.py import ansible.module_utils.common.process # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload__66pz3fr/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/process.py <<< 41684 1727204444.65861: stdout chunk (state=3): >>># zipimport: zlib available <<< 41684 1727204444.65918: stdout chunk (state=3): >>># zipimport: zlib available <<< 41684 1727204444.65982: stdout chunk (state=3): >>># zipimport: zlib available <<< 41684 1727204444.65996: stdout chunk (state=3): >>># zipimport: zlib available <<< 41684 1727204444.66046: stdout chunk (state=3): >>># /usr/lib/python3.9/site-packages/__pycache__/distro.cpython-39.pyc matches /usr/lib/python3.9/site-packages/distro.py <<< 41684 1727204444.66053: stdout chunk (state=3): >>># code object from '/usr/lib/python3.9/site-packages/__pycache__/distro.cpython-39.pyc' <<< 41684 1727204444.66074: stdout chunk (state=3): >>># 
/usr/lib64/python3.9/__pycache__/argparse.cpython-39.pyc matches /usr/lib64/python3.9/argparse.py <<< 41684 1727204444.66108: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/argparse.cpython-39.pyc' <<< 41684 1727204444.66127: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/gettext.cpython-39.pyc matches /usr/lib64/python3.9/gettext.py <<< 41684 1727204444.66144: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/gettext.cpython-39.pyc' <<< 41684 1727204444.66229: stdout chunk (state=3): >>>import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb5d7d297f0> <<< 41684 1727204444.66267: stdout chunk (state=3): >>>import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb5d7d28790> <<< 41684 1727204444.66334: stdout chunk (state=3): >>>import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb5d7d23b50> # destroy ansible.module_utils.distro import ansible.module_utils.distro # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload__66pz3fr/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/distro/__init__.py <<< 41684 1727204444.66338: stdout chunk (state=3): >>># zipimport: zlib available <<< 41684 1727204444.66350: stdout chunk (state=3): >>># zipimport: zlib available <<< 41684 1727204444.66386: stdout chunk (state=3): >>>import ansible.module_utils.common._utils # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload__66pz3fr/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/_utils.py import ansible.module_utils.common.sys_info # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload__66pz3fr/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/sys_info.py <<< 41684 1727204444.66458: stdout chunk (state=3): >>>import ansible.module_utils.basic # loaded from Zip 
/tmp/ansible_ansible.legacy.setup_payload__66pz3fr/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/basic.py <<< 41684 1727204444.66487: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 41684 1727204444.66490: stdout chunk (state=3): >>>import ansible.modules # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload__66pz3fr/ansible_ansible.legacy.setup_payload.zip/ansible/modules/__init__.py <<< 41684 1727204444.66501: stdout chunk (state=3): >>># zipimport: zlib available <<< 41684 1727204444.66559: stdout chunk (state=3): >>># zipimport: zlib available <<< 41684 1727204444.66613: stdout chunk (state=3): >>># zipimport: zlib available <<< 41684 1727204444.66629: stdout chunk (state=3): >>># zipimport: zlib available <<< 41684 1727204444.66640: stdout chunk (state=3): >>># zipimport: zlib available <<< 41684 1727204444.66686: stdout chunk (state=3): >>># zipimport: zlib available <<< 41684 1727204444.66718: stdout chunk (state=3): >>># zipimport: zlib available <<< 41684 1727204444.66752: stdout chunk (state=3): >>># zipimport: zlib available <<< 41684 1727204444.66794: stdout chunk (state=3): >>>import ansible.module_utils.facts.namespace # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload__66pz3fr/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/namespace.py <<< 41684 1727204444.66802: stdout chunk (state=3): >>># zipimport: zlib available <<< 41684 1727204444.66859: stdout chunk (state=3): >>># zipimport: zlib available <<< 41684 1727204444.66927: stdout chunk (state=3): >>># zipimport: zlib available <<< 41684 1727204444.66939: stdout chunk (state=3): >>># zipimport: zlib available <<< 41684 1727204444.66980: stdout chunk (state=3): >>>import ansible.module_utils.compat.typing # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload__66pz3fr/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/compat/typing.py # zipimport: zlib available <<< 41684 1727204444.67129: 
stdout chunk (state=3): >>># zipimport: zlib available <<< 41684 1727204444.67270: stdout chunk (state=3): >>># zipimport: zlib available <<< 41684 1727204444.67303: stdout chunk (state=3): >>># zipimport: zlib available <<< 41684 1727204444.67359: stdout chunk (state=3): >>># /usr/lib64/python3.9/multiprocessing/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/__init__.py <<< 41684 1727204444.67366: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/__init__.cpython-39.pyc' <<< 41684 1727204444.67389: stdout chunk (state=3): >>># /usr/lib64/python3.9/multiprocessing/__pycache__/context.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/context.py <<< 41684 1727204444.67395: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/context.cpython-39.pyc' <<< 41684 1727204444.67409: stdout chunk (state=3): >>># /usr/lib64/python3.9/multiprocessing/__pycache__/process.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/process.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/process.cpython-39.pyc' <<< 41684 1727204444.67444: stdout chunk (state=3): >>>import 'multiprocessing.process' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb5d7840370> <<< 41684 1727204444.67471: stdout chunk (state=3): >>># /usr/lib64/python3.9/multiprocessing/__pycache__/reduction.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/reduction.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/reduction.cpython-39.pyc' <<< 41684 1727204444.67489: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/pickle.cpython-39.pyc matches /usr/lib64/python3.9/pickle.py <<< 41684 1727204444.67519: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/pickle.cpython-39.pyc' <<< 41684 1727204444.67552: stdout chunk (state=3): >>># 
/usr/lib64/python3.9/__pycache__/_compat_pickle.cpython-39.pyc matches /usr/lib64/python3.9/_compat_pickle.py <<< 41684 1727204444.67555: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/_compat_pickle.cpython-39.pyc' <<< 41684 1727204444.67567: stdout chunk (state=3): >>>import '_compat_pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb5d785c580> <<< 41684 1727204444.67610: stdout chunk (state=3): >>># extension module '_pickle' loaded from '/usr/lib64/python3.9/lib-dynload/_pickle.cpython-39-x86_64-linux-gnu.so' <<< 41684 1727204444.67613: stdout chunk (state=3): >>># extension module '_pickle' executed from '/usr/lib64/python3.9/lib-dynload/_pickle.cpython-39-x86_64-linux-gnu.so' import '_pickle' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb5d785c4f0> <<< 41684 1727204444.67693: stdout chunk (state=3): >>>import 'pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb5d7830280> <<< 41684 1727204444.67697: stdout chunk (state=3): >>>import 'multiprocessing.reduction' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb5d7840970> <<< 41684 1727204444.67722: stdout chunk (state=3): >>>import 'multiprocessing.context' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb5d75fb7f0> <<< 41684 1727204444.67725: stdout chunk (state=3): >>>import 'multiprocessing' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb5d75fbb20> <<< 41684 1727204444.67742: stdout chunk (state=3): >>># /usr/lib64/python3.9/multiprocessing/__pycache__/pool.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/pool.py <<< 41684 1727204444.67753: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/pool.cpython-39.pyc' <<< 41684 1727204444.67786: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/queue.cpython-39.pyc matches /usr/lib64/python3.9/queue.py # code object from 
'/usr/lib64/python3.9/__pycache__/queue.cpython-39.pyc' <<< 41684 1727204444.67834: stdout chunk (state=3): >>># extension module '_queue' loaded from '/usr/lib64/python3.9/lib-dynload/_queue.cpython-39-x86_64-linux-gnu.so' # extension module '_queue' executed from '/usr/lib64/python3.9/lib-dynload/_queue.cpython-39-x86_64-linux-gnu.so' import '_queue' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb5d789e0a0> <<< 41684 1727204444.67838: stdout chunk (state=3): >>>import 'queue' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb5d783df70> <<< 41684 1727204444.67860: stdout chunk (state=3): >>># /usr/lib64/python3.9/multiprocessing/__pycache__/util.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/util.py <<< 41684 1727204444.67865: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/util.cpython-39.pyc' <<< 41684 1727204444.67897: stdout chunk (state=3): >>>import 'multiprocessing.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb5d789e190> <<< 41684 1727204444.67915: stdout chunk (state=3): >>># /usr/lib64/python3.9/multiprocessing/__pycache__/connection.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/connection.py <<< 41684 1727204444.67935: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/connection.cpython-39.pyc' <<< 41684 1727204444.67970: stdout chunk (state=3): >>># extension module '_multiprocessing' loaded from '/usr/lib64/python3.9/lib-dynload/_multiprocessing.cpython-39-x86_64-linux-gnu.so' <<< 41684 1727204444.67975: stdout chunk (state=3): >>># extension module '_multiprocessing' executed from '/usr/lib64/python3.9/lib-dynload/_multiprocessing.cpython-39-x86_64-linux-gnu.so' import '_multiprocessing' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb5d7663fd0> <<< 41684 1727204444.67990: stdout chunk (state=3): >>>import 'multiprocessing.connection' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7fb5d788c820> <<< 41684 1727204444.68041: stdout chunk (state=3): >>>import 'multiprocessing.pool' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb5d75fbd60> import ansible.module_utils.facts.timeout # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload__66pz3fr/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/timeout.py <<< 41684 1727204444.68046: stdout chunk (state=3): >>>import ansible.module_utils.facts.collector # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload__66pz3fr/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/collector.py <<< 41684 1727204444.68048: stdout chunk (state=3): >>># zipimport: zlib available <<< 41684 1727204444.68066: stdout chunk (state=3): >>># zipimport: zlib available import ansible.module_utils.facts.other # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload__66pz3fr/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/other/__init__.py <<< 41684 1727204444.68081: stdout chunk (state=3): >>># zipimport: zlib available <<< 41684 1727204444.68133: stdout chunk (state=3): >>># zipimport: zlib available <<< 41684 1727204444.68180: stdout chunk (state=3): >>>import ansible.module_utils.facts.other.facter # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload__66pz3fr/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/other/facter.py # zipimport: zlib available <<< 41684 1727204444.68230: stdout chunk (state=3): >>># zipimport: zlib available <<< 41684 1727204444.68273: stdout chunk (state=3): >>>import ansible.module_utils.facts.other.ohai # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload__66pz3fr/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/other/ohai.py <<< 41684 1727204444.68308: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 41684 1727204444.68312: stdout chunk (state=3): >>>import 
ansible.module_utils.facts.system # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload__66pz3fr/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/__init__.py <<< 41684 1727204444.68314: stdout chunk (state=3): >>># zipimport: zlib available <<< 41684 1727204444.68343: stdout chunk (state=3): >>># zipimport: zlib available <<< 41684 1727204444.68375: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.apparmor # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload__66pz3fr/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/apparmor.py <<< 41684 1727204444.68387: stdout chunk (state=3): >>># zipimport: zlib available <<< 41684 1727204444.68421: stdout chunk (state=3): >>># zipimport: zlib available <<< 41684 1727204444.68469: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.caps # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload__66pz3fr/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/caps.py # zipimport: zlib available <<< 41684 1727204444.68513: stdout chunk (state=3): >>># zipimport: zlib available <<< 41684 1727204444.68546: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.chroot # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload__66pz3fr/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/chroot.py <<< 41684 1727204444.68558: stdout chunk (state=3): >>># zipimport: zlib available <<< 41684 1727204444.68612: stdout chunk (state=3): >>># zipimport: zlib available <<< 41684 1727204444.68662: stdout chunk (state=3): >>># zipimport: zlib available <<< 41684 1727204444.68708: stdout chunk (state=3): >>># zipimport: zlib available <<< 41684 1727204444.68768: stdout chunk (state=3): >>>import ansible.module_utils.facts.utils # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload__66pz3fr/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/utils.py import 
ansible.module_utils.facts.system.cmdline # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload__66pz3fr/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/cmdline.py <<< 41684 1727204444.68772: stdout chunk (state=3): >>># zipimport: zlib available <<< 41684 1727204444.69160: stdout chunk (state=3): >>># zipimport: zlib available <<< 41684 1727204444.69533: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.distribution # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload__66pz3fr/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/distribution.py <<< 41684 1727204444.69536: stdout chunk (state=3): >>># zipimport: zlib available <<< 41684 1727204444.69587: stdout chunk (state=3): >>># zipimport: zlib available <<< 41684 1727204444.69628: stdout chunk (state=3): >>># zipimport: zlib available <<< 41684 1727204444.69650: stdout chunk (state=3): >>># zipimport: zlib available <<< 41684 1727204444.69700: stdout chunk (state=3): >>>import ansible.module_utils.compat.datetime # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload__66pz3fr/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/compat/datetime.py import ansible.module_utils.facts.system.date_time # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload__66pz3fr/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/date_time.py <<< 41684 1727204444.69706: stdout chunk (state=3): >>># zipimport: zlib available <<< 41684 1727204444.69723: stdout chunk (state=3): >>># zipimport: zlib available <<< 41684 1727204444.69755: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.env # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload__66pz3fr/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/env.py # zipimport: zlib available <<< 41684 1727204444.69815: stdout chunk (state=3): >>># zipimport: zlib available <<< 41684 1727204444.69863: stdout chunk 
(state=3): >>>import ansible.module_utils.facts.system.dns # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload__66pz3fr/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/dns.py <<< 41684 1727204444.69872: stdout chunk (state=3): >>># zipimport: zlib available <<< 41684 1727204444.69886: stdout chunk (state=3): >>># zipimport: zlib available <<< 41684 1727204444.69927: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.fips # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload__66pz3fr/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/fips.py <<< 41684 1727204444.69934: stdout chunk (state=3): >>># zipimport: zlib available <<< 41684 1727204444.69953: stdout chunk (state=3): >>># zipimport: zlib available <<< 41684 1727204444.69990: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.loadavg # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload__66pz3fr/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/loadavg.py <<< 41684 1727204444.69993: stdout chunk (state=3): >>># zipimport: zlib available <<< 41684 1727204444.70052: stdout chunk (state=3): >>># zipimport: zlib available <<< 41684 1727204444.70130: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/glob.cpython-39.pyc matches /usr/lib64/python3.9/glob.py # code object from '/usr/lib64/python3.9/__pycache__/glob.cpython-39.pyc' <<< 41684 1727204444.70155: stdout chunk (state=3): >>>import 'glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb5d7549e80> <<< 41684 1727204444.70172: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/configparser.cpython-39.pyc matches /usr/lib64/python3.9/configparser.py <<< 41684 1727204444.70202: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/configparser.cpython-39.pyc' <<< 41684 1727204444.70358: stdout chunk (state=3): >>>import 'configparser' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7fb5d75499d0> <<< 41684 1727204444.70361: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.local # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload__66pz3fr/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/local.py # zipimport: zlib available <<< 41684 1727204444.70421: stdout chunk (state=3): >>># zipimport: zlib available <<< 41684 1727204444.70486: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.lsb # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload__66pz3fr/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/lsb.py <<< 41684 1727204444.70490: stdout chunk (state=3): >>># zipimport: zlib available <<< 41684 1727204444.70558: stdout chunk (state=3): >>># zipimport: zlib available <<< 41684 1727204444.70637: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.pkg_mgr # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload__66pz3fr/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/pkg_mgr.py # zipimport: zlib available <<< 41684 1727204444.70703: stdout chunk (state=3): >>># zipimport: zlib available <<< 41684 1727204444.70771: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.platform # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload__66pz3fr/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/platform.py <<< 41684 1727204444.70785: stdout chunk (state=3): >>># zipimport: zlib available <<< 41684 1727204444.70808: stdout chunk (state=3): >>># zipimport: zlib available <<< 41684 1727204444.70857: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/ssl.cpython-39.pyc matches /usr/lib64/python3.9/ssl.py <<< 41684 1727204444.70876: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/ssl.cpython-39.pyc' <<< 41684 1727204444.71017: stdout chunk (state=3): >>># extension module 
'_ssl' loaded from '/usr/lib64/python3.9/lib-dynload/_ssl.cpython-39-x86_64-linux-gnu.so' # extension module '_ssl' executed from '/usr/lib64/python3.9/lib-dynload/_ssl.cpython-39-x86_64-linux-gnu.so' import '_ssl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb5d75c1490> <<< 41684 1727204444.71271: stdout chunk (state=3): >>>import 'ssl' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb5d755a850> import ansible.module_utils.facts.system.python # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload__66pz3fr/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/python.py <<< 41684 1727204444.71274: stdout chunk (state=3): >>># zipimport: zlib available <<< 41684 1727204444.71313: stdout chunk (state=3): >>># zipimport: zlib available <<< 41684 1727204444.71360: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.selinux # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload__66pz3fr/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/selinux.py # zipimport: zlib available <<< 41684 1727204444.71440: stdout chunk (state=3): >>># zipimport: zlib available <<< 41684 1727204444.71507: stdout chunk (state=3): >>># zipimport: zlib available <<< 41684 1727204444.71605: stdout chunk (state=3): >>># zipimport: zlib available <<< 41684 1727204444.71739: stdout chunk (state=3): >>>import ansible.module_utils.compat.version # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload__66pz3fr/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/compat/version.py <<< 41684 1727204444.71743: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.service_mgr # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload__66pz3fr/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/service_mgr.py # zipimport: zlib available <<< 41684 1727204444.71786: stdout chunk (state=3): >>># zipimport: zlib available <<< 41684 1727204444.71824: stdout 
chunk (state=3): >>>import ansible.module_utils.facts.system.ssh_pub_keys # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload__66pz3fr/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/ssh_pub_keys.py <<< 41684 1727204444.71828: stdout chunk (state=3): >>># zipimport: zlib available <<< 41684 1727204444.71854: stdout chunk (state=3): >>># zipimport: zlib available <<< 41684 1727204444.71904: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/getpass.cpython-39.pyc matches /usr/lib64/python3.9/getpass.py # code object from '/usr/lib64/python3.9/__pycache__/getpass.cpython-39.pyc' <<< 41684 1727204444.71974: stdout chunk (state=3): >>># extension module 'termios' loaded from '/usr/lib64/python3.9/lib-dynload/termios.cpython-39-x86_64-linux-gnu.so' # extension module 'termios' executed from '/usr/lib64/python3.9/lib-dynload/termios.cpython-39-x86_64-linux-gnu.so' import 'termios' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb5d75be670> <<< 41684 1727204444.71979: stdout chunk (state=3): >>>import 'getpass' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb5d75be220> import ansible.module_utils.facts.system.user # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload__66pz3fr/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/user.py <<< 41684 1727204444.71982: stdout chunk (state=3): >>># zipimport: zlib available <<< 41684 1727204444.72003: stdout chunk (state=3): >>># zipimport: zlib available import ansible.module_utils.facts.hardware # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload__66pz3fr/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/__init__.py <<< 41684 1727204444.72006: stdout chunk (state=3): >>># zipimport: zlib available <<< 41684 1727204444.72048: stdout chunk (state=3): >>># zipimport: zlib available <<< 41684 1727204444.72091: stdout chunk (state=3): >>>import ansible.module_utils.facts.hardware.base # loaded 
from Zip /tmp/ansible_ansible.legacy.setup_payload__66pz3fr/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/base.py <<< 41684 1727204444.72095: stdout chunk (state=3): >>># zipimport: zlib available <<< 41684 1727204444.72215: stdout chunk (state=3): >>># zipimport: zlib available <<< 41684 1727204444.72342: stdout chunk (state=3): >>>import ansible.module_utils.facts.hardware.aix # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload__66pz3fr/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/aix.py # zipimport: zlib available <<< 41684 1727204444.72433: stdout chunk (state=3): >>># zipimport: zlib available <<< 41684 1727204444.72514: stdout chunk (state=3): >>># zipimport: zlib available <<< 41684 1727204444.72551: stdout chunk (state=3): >>># zipimport: zlib available <<< 41684 1727204444.72593: stdout chunk (state=3): >>>import ansible.module_utils.facts.sysctl # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload__66pz3fr/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/sysctl.py <<< 41684 1727204444.72597: stdout chunk (state=3): >>>import ansible.module_utils.facts.hardware.darwin # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload__66pz3fr/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/darwin.py # zipimport: zlib available <<< 41684 1727204444.72681: stdout chunk (state=3): >>># zipimport: zlib available <<< 41684 1727204444.72693: stdout chunk (state=3): >>># zipimport: zlib available <<< 41684 1727204444.72810: stdout chunk (state=3): >>># zipimport: zlib available <<< 41684 1727204444.72935: stdout chunk (state=3): >>>import ansible.module_utils.facts.hardware.freebsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload__66pz3fr/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/freebsd.py <<< 41684 1727204444.72938: stdout chunk (state=3): >>>import ansible.module_utils.facts.hardware.dragonfly # 
loaded from Zip /tmp/ansible_ansible.legacy.setup_payload__66pz3fr/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/dragonfly.py # zipimport: zlib available <<< 41684 1727204444.73045: stdout chunk (state=3): >>># zipimport: zlib available <<< 41684 1727204444.73154: stdout chunk (state=3): >>>import ansible.module_utils.facts.hardware.hpux # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload__66pz3fr/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/hpux.py <<< 41684 1727204444.73157: stdout chunk (state=3): >>># zipimport: zlib available <<< 41684 1727204444.73186: stdout chunk (state=3): >>># zipimport: zlib available <<< 41684 1727204444.73217: stdout chunk (state=3): >>># zipimport: zlib available <<< 41684 1727204444.73649: stdout chunk (state=3): >>># zipimport: zlib available <<< 41684 1727204444.74068: stdout chunk (state=3): >>>import ansible.module_utils.facts.hardware.linux # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload__66pz3fr/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/linux.py import ansible.module_utils.facts.hardware.hurd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload__66pz3fr/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/hurd.py <<< 41684 1727204444.74081: stdout chunk (state=3): >>># zipimport: zlib available <<< 41684 1727204444.74161: stdout chunk (state=3): >>># zipimport: zlib available <<< 41684 1727204444.74250: stdout chunk (state=3): >>>import ansible.module_utils.facts.hardware.netbsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload__66pz3fr/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/netbsd.py # zipimport: zlib available <<< 41684 1727204444.74340: stdout chunk (state=3): >>># zipimport: zlib available <<< 41684 1727204444.74430: stdout chunk (state=3): >>>import ansible.module_utils.facts.hardware.openbsd # loaded from Zip 
/tmp/ansible_ansible.legacy.setup_payload__66pz3fr/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/openbsd.py <<< 41684 1727204444.74434: stdout chunk (state=3): >>># zipimport: zlib available <<< 41684 1727204444.74556: stdout chunk (state=3): >>># zipimport: zlib available <<< 41684 1727204444.74701: stdout chunk (state=3): >>>import ansible.module_utils.facts.hardware.sunos # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload__66pz3fr/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/sunos.py # zipimport: zlib available <<< 41684 1727204444.74714: stdout chunk (state=3): >>># zipimport: zlib available import ansible.module_utils.facts.network # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload__66pz3fr/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/__init__.py <<< 41684 1727204444.74727: stdout chunk (state=3): >>># zipimport: zlib available <<< 41684 1727204444.74769: stdout chunk (state=3): >>># zipimport: zlib available<<< 41684 1727204444.74780: stdout chunk (state=3): >>> <<< 41684 1727204444.74807: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.base # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload__66pz3fr/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/base.py <<< 41684 1727204444.74820: stdout chunk (state=3): >>># zipimport: zlib available <<< 41684 1727204444.74901: stdout chunk (state=3): >>># zipimport: zlib available <<< 41684 1727204444.74987: stdout chunk (state=3): >>># zipimport: zlib available <<< 41684 1727204444.75156: stdout chunk (state=3): >>># zipimport: zlib available <<< 41684 1727204444.75325: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.generic_bsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload__66pz3fr/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/generic_bsd.py import ansible.module_utils.facts.network.aix # 
loaded from Zip /tmp/ansible_ansible.legacy.setup_payload__66pz3fr/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/aix.py <<< 41684 1727204444.75338: stdout chunk (state=3): >>># zipimport: zlib available <<< 41684 1727204444.75375: stdout chunk (state=3): >>># zipimport: zlib available <<< 41684 1727204444.75408: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.darwin # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload__66pz3fr/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/darwin.py # zipimport: zlib available <<< 41684 1727204444.75438: stdout chunk (state=3): >>># zipimport: zlib available <<< 41684 1727204444.75465: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.dragonfly # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload__66pz3fr/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/dragonfly.py <<< 41684 1727204444.75469: stdout chunk (state=3): >>># zipimport: zlib available <<< 41684 1727204444.75528: stdout chunk (state=3): >>># zipimport: zlib available <<< 41684 1727204444.75601: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.fc_wwn # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload__66pz3fr/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/fc_wwn.py <<< 41684 1727204444.75606: stdout chunk (state=3): >>># zipimport: zlib available <<< 41684 1727204444.75622: stdout chunk (state=3): >>># zipimport: zlib available <<< 41684 1727204444.75648: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.freebsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload__66pz3fr/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/freebsd.py <<< 41684 1727204444.75651: stdout chunk (state=3): >>># zipimport: zlib available <<< 41684 1727204444.75704: stdout chunk (state=3): >>># zipimport: zlib available <<< 41684 1727204444.75757: 
stdout chunk (state=3): >>>import ansible.module_utils.facts.network.hpux # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload__66pz3fr/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/hpux.py <<< 41684 1727204444.75761: stdout chunk (state=3): >>># zipimport: zlib available <<< 41684 1727204444.75811: stdout chunk (state=3): >>># zipimport: zlib available <<< 41684 1727204444.75869: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.hurd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload__66pz3fr/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/hurd.py <<< 41684 1727204444.75873: stdout chunk (state=3): >>># zipimport: zlib available <<< 41684 1727204444.76083: stdout chunk (state=3): >>># zipimport: zlib available <<< 41684 1727204444.76297: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.linux # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload__66pz3fr/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/linux.py # zipimport: zlib available <<< 41684 1727204444.76355: stdout chunk (state=3): >>># zipimport: zlib available <<< 41684 1727204444.76416: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.iscsi # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload__66pz3fr/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/iscsi.py <<< 41684 1727204444.76419: stdout chunk (state=3): >>># zipimport: zlib available <<< 41684 1727204444.76443: stdout chunk (state=3): >>># zipimport: zlib available <<< 41684 1727204444.76481: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.nvme # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload__66pz3fr/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/nvme.py # zipimport: zlib available <<< 41684 1727204444.76517: stdout chunk (state=3): >>># zipimport: zlib available <<< 41684 1727204444.76544: 
stdout chunk (state=3): >>>import ansible.module_utils.facts.network.netbsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload__66pz3fr/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/netbsd.py <<< 41684 1727204444.76557: stdout chunk (state=3): >>># zipimport: zlib available <<< 41684 1727204444.76584: stdout chunk (state=3): >>># zipimport: zlib available <<< 41684 1727204444.76620: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.openbsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload__66pz3fr/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/openbsd.py <<< 41684 1727204444.76623: stdout chunk (state=3): >>># zipimport: zlib available <<< 41684 1727204444.76692: stdout chunk (state=3): >>># zipimport: zlib available <<< 41684 1727204444.76783: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.sunos # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload__66pz3fr/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/sunos.py <<< 41684 1727204444.76787: stdout chunk (state=3): >>># zipimport: zlib available <<< 41684 1727204444.76789: stdout chunk (state=3): >>># zipimport: zlib available import ansible.module_utils.facts.virtual # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload__66pz3fr/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/__init__.py <<< 41684 1727204444.76802: stdout chunk (state=3): >>># zipimport: zlib available <<< 41684 1727204444.76844: stdout chunk (state=3): >>># zipimport: zlib available <<< 41684 1727204444.76905: stdout chunk (state=3): >>>import ansible.module_utils.facts.virtual.base # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload__66pz3fr/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/base.py <<< 41684 1727204444.76914: stdout chunk (state=3): >>># zipimport: zlib available <<< 41684 1727204444.76919: stdout chunk 
(state=3): >>># zipimport: zlib available <<< 41684 1727204444.76932: stdout chunk (state=3): >>># zipimport: zlib available <<< 41684 1727204444.76976: stdout chunk (state=3): >>># zipimport: zlib available <<< 41684 1727204444.77048: stdout chunk (state=3): >>># zipimport: zlib available <<< 41684 1727204444.77078: stdout chunk (state=3): >>># zipimport: zlib available <<< 41684 1727204444.77147: stdout chunk (state=3): >>>import ansible.module_utils.facts.virtual.sysctl # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload__66pz3fr/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/sysctl.py import ansible.module_utils.facts.virtual.freebsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload__66pz3fr/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/freebsd.py <<< 41684 1727204444.77154: stdout chunk (state=3): >>>import ansible.module_utils.facts.virtual.dragonfly # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload__66pz3fr/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/dragonfly.py # zipimport: zlib available <<< 41684 1727204444.77203: stdout chunk (state=3): >>># zipimport: zlib available <<< 41684 1727204444.77254: stdout chunk (state=3): >>>import ansible.module_utils.facts.virtual.hpux # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload__66pz3fr/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/hpux.py <<< 41684 1727204444.77257: stdout chunk (state=3): >>># zipimport: zlib available <<< 41684 1727204444.77414: stdout chunk (state=3): >>># zipimport: zlib available <<< 41684 1727204444.77580: stdout chunk (state=3): >>>import ansible.module_utils.facts.virtual.linux # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload__66pz3fr/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/linux.py # zipimport: zlib available <<< 41684 1727204444.77625: stdout chunk (state=3): >>># zipimport: zlib 
available <<< 41684 1727204444.77669: stdout chunk (state=3): >>>import ansible.module_utils.facts.virtual.netbsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload__66pz3fr/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/netbsd.py <<< 41684 1727204444.77683: stdout chunk (state=3): >>># zipimport: zlib available <<< 41684 1727204444.77716: stdout chunk (state=3): >>># zipimport: zlib available <<< 41684 1727204444.77769: stdout chunk (state=3): >>>import ansible.module_utils.facts.virtual.openbsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload__66pz3fr/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/openbsd.py <<< 41684 1727204444.77772: stdout chunk (state=3): >>># zipimport: zlib available <<< 41684 1727204444.77835: stdout chunk (state=3): >>># zipimport: zlib available <<< 41684 1727204444.77914: stdout chunk (state=3): >>>import ansible.module_utils.facts.virtual.sunos # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload__66pz3fr/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/sunos.py <<< 41684 1727204444.77917: stdout chunk (state=3): >>>import ansible.module_utils.facts.default_collectors # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload__66pz3fr/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/default_collectors.py # zipimport: zlib available <<< 41684 1727204444.77992: stdout chunk (state=3): >>># zipimport: zlib available <<< 41684 1727204444.78069: stdout chunk (state=3): >>>import ansible.module_utils.facts.ansible_collector # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload__66pz3fr/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/ansible_collector.py <<< 41684 1727204444.78083: stdout chunk (state=3): >>>import ansible.module_utils.facts.compat # loaded from Zip 
/tmp/ansible_ansible.legacy.setup_payload__66pz3fr/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/compat.py import ansible.module_utils.facts # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload__66pz3fr/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/__init__.py <<< 41684 1727204444.78148: stdout chunk (state=3): >>># zipimport: zlib available <<< 41684 1727204444.78901: stdout chunk (state=3): >>>import 'gc' # <<< 41684 1727204444.79433: stdout chunk (state=3): >>># /usr/lib64/python3.9/encodings/__pycache__/idna.cpython-39.pyc matches /usr/lib64/python3.9/encodings/idna.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/idna.cpython-39.pyc' <<< 41684 1727204444.79467: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/stringprep.cpython-39.pyc matches /usr/lib64/python3.9/stringprep.py <<< 41684 1727204444.79471: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/stringprep.cpython-39.pyc' <<< 41684 1727204444.79513: stdout chunk (state=3): >>># extension module 'unicodedata' loaded from '/usr/lib64/python3.9/lib-dynload/unicodedata.cpython-39-x86_64-linux-gnu.so' # extension module 'unicodedata' executed from '/usr/lib64/python3.9/lib-dynload/unicodedata.cpython-39-x86_64-linux-gnu.so' import 'unicodedata' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb5d7502130> <<< 41684 1727204444.79516: stdout chunk (state=3): >>>import 'stringprep' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb5d7d37670> <<< 41684 1727204444.79580: stdout chunk (state=3): >>>import 'encodings.idna' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb5d75578e0> <<< 41684 1727204444.81469: stdout chunk (state=3): >>># /usr/lib64/python3.9/multiprocessing/__pycache__/queues.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/queues.py <<< 41684 1727204444.81473: stdout chunk (state=3): >>># code object from 
'/usr/lib64/python3.9/multiprocessing/__pycache__/queues.cpython-39.pyc' <<< 41684 1727204444.81498: stdout chunk (state=3): >>>import 'multiprocessing.queues' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb5d7557df0> <<< 41684 1727204444.81501: stdout chunk (state=3): >>># /usr/lib64/python3.9/multiprocessing/__pycache__/synchronize.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/synchronize.py <<< 41684 1727204444.81539: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/synchronize.cpython-39.pyc' <<< 41684 1727204444.81542: stdout chunk (state=3): >>>import 'multiprocessing.synchronize' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb5d755a6a0> <<< 41684 1727204444.81612: stdout chunk (state=3): >>># /usr/lib64/python3.9/multiprocessing/dummy/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/dummy/__init__.py <<< 41684 1727204444.81618: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/multiprocessing/dummy/__pycache__/__init__.cpython-39.pyc' <<< 41684 1727204444.81652: stdout chunk (state=3): >>># /usr/lib64/python3.9/multiprocessing/dummy/__pycache__/connection.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/dummy/connection.py # code object from '/usr/lib64/python3.9/multiprocessing/dummy/__pycache__/connection.cpython-39.pyc' <<< 41684 1727204444.81655: stdout chunk (state=3): >>>import 'multiprocessing.dummy.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb5d73605e0> <<< 41684 1727204444.81658: stdout chunk (state=3): >>>import 'multiprocessing.dummy' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb5d7383190> <<< 41684 1727204444.81902: stdout chunk (state=3): >>>PyThreadState_Clear: warning: thread still has a frame <<< 41684 1727204444.81915: stdout chunk (state=3): >>>PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a 
frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame <<< 41684 1727204445.06897: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "9", "ansible_distribution_major_version": "9", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_fibre_channel_wwn": [], "ansible_ssh_host_key_dsa_public": "AAAAB3NzaC1kc3MAAACBAL33r0sK53nK1ELEWEygFfBly+jKL3G1irB+e4OjfP+034giVSb4+qmZbsccYzFRUysDiwQ9AOkXxjXzuDmR+xqyvjg1wiGR1mtnhVEjD5QOMP3FrsA4T0YUj+99RePF5V1syZcivhL83fhGMQW2xqX2DsatToaaogZ2OB3PfjtjAAAAFQDxVD/D0jmbOX2y1lkpNUepQHopQwAAAIEAmIlCnXBlcPjUWk7TsM1FtBhlH1jHHCOTF1EkipNNB3yizhCo4XzHdZ42Etc3A12/rcZ94rFaauqCV6XrShBkQ2YBIcz9u8BOyWI/nScoq9IA/qLrhWWtjBWDyrdKnEa5YZssQtDa+FaZQkzy1TQpvFQxv5c95+TrmPFgDpw+0q0AAACBAKYOTFtEPTGzq9w6YdMspRWk65ZgXou58bQl818PvNnuZKKVReDFknfNCcabfj+HjlOg9wBCZZ+D3vopxZ4Qgevz/pLqcnLY7Kxx+xf6NhqDwcEwkHk/VYomBLrfyEZP8N81dcv36ZZUVoca5Y+2ZG2o1gC632nLGosyJBtmPmel", "ansible_ssh_host_key_dsa_public_keytype": "ssh-dss", "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCzhflzK5hY1zLI6wcdpu95QnCG0TwdK/8SyDudYYHDoRycyI9pVuSUQsXpQq3jHdjCurFgtKtyr8lvo1dWvKJ9SZpQk4asejncDNODyhSR95eNQg6E1G2kN1mscOp76cjW9Muvyhcku112WdRWTEojLJL5DfJAWrsWwHJI+QuhZuKvrlsxPvfOuY5td/aGC/Ydzbjkmya2qvXXJRscQArDnYmiPiatkFESRif9MXdmIn2LqQXAcZGFUG+SWQvZR1PDWKI2U5HxvoeUf+Uh2jDO3mFDWao9+SGRC2QuO+xLJgoiKIx2L3GWLTkbKjAbsk0iedaUuh+GdmUatsU09UVZi9IYBJYjhiYuZKsYx2LNpBqu8fxh5RaBfadQzkDGVBJE45/9X+9vlSygk3zMak9yWtS9vfV+CoODJx9wA1tv3r0Veiy/Y9bbcT7DtQhiWscP2X/cF2QZtdabW+Rb+zKZomn+6upN+zZeyVRClRsqVNURxevMs+UyJTKV481ayMU=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": 
"AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBHpEZiCiKJZKK5PvXzPGl0kyJcU4P7nxoUjBffLcHt9dAB0RhjGORZ4v3/W6TdO0PAsLaKZ7WyFecLN3V9VWyiA=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIK5vZWfq5/76ny3vCPOJqG/mpsIiiNwZzQWhA7b<<< 41684 1727204445.06916: stdout chunk (state=3): >>>M1PFT", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_is_chroot": false, "ansible_python": {"version": {"major": 3, "minor": 9, "micro": 19, "releaselevel": "final", "serial": 0}, "version_info": [3, 9, 19, "final", 0], "executable": "/usr/bin/python3.9", "has_sslcontext": true, "type": "cpython"}, "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "root", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_system": "Linux", "ansible_kernel": "5.14.0-511.el9.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Thu Sep 19 06:52:39 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.9.19", "ansible_fqdn": "managed-node1", "ansible_hostname": "managed-node1", "ansible_nodename": "managed-node1", "ansible_domain": "", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "622812727ab94fd6acd7dd0d437b6e90", "ansible_loadavg": {"1m": 0.42, "5m": 0.44, "15m": 0.27}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-511.el9.x86_64", "root": "UUID=ad406aa3-aab4-4a6a-aa73-3e870a6316ae", "ro": true, "rhgb": true, "crashkernel": 
"1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-511.el9.x86_64", "root": "UUID=ad406aa3-aab4-4a6a-aa73-3e870a6316ae", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Tuesday", "weekday_number": "2", "weeknumber": "39", "day": "24", "hour": "15", "minute": "00", "second": "44", "epoch": "1727204444", "epoch_int": "1727204444", "date": "2024-09-24", "time": "15:00:44", "iso8601_micro": "2024-09-24T19:00:44.798987Z", "iso8601": "2024-09-24T19:00:44Z", "iso8601_basic": "20240924T150044798987", "iso8601_basic_short": "20240924T150044", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_apparmor": {"status": "disabled"}, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:d5aef1ea-3141-48ae-bf33-0c6b351dd422", "ansible_iscsi_iqn": "", "ansible_lsb": {}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3532, "ansible_memfree_mb": 2790, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3532, "used": 742, "free": 2790}, "nocache": {"free": 3265, "used": 267}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", 
"ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec28c5e6-50d6-5684-e735-f75357a23b08", "ansible_product_uuid": "ec28c5e6-50d6-5684-e735-f75357a23b08", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "remov<<< 41684 1727204445.06942: stdout chunk (state=3): >>>able": "0", "support_discard": "512", "partitions": {"xvda1": {"links": {"ids": [], "uuids": ["ad406aa3-aab4-4a6a-aa73-3e870a6316ae"], "labels": [], "masters": []}, "start": "2048", "sectors": "524285919", "sectorsize": 512, "size": "250.00 GB", "uuid": "ad406aa3-aab4-4a6a-aa73-3e870a6316ae", "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda1": ["ad406aa3-aab4-4a6a-aa73-3e870a6316ae"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 707, "ansible_lvm": "N/A", "ansible_mounts": [{"mount": "/", "device": "/dev/xvda1", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268367278080, "size_available": 264271777792, "block_size": 4096, "block_total": 65519355, "block_available": 64519477, "block_used": 999878, "inode_total": 131071472, "inode_available": 130998230, "inode_used": 73242, "uuid": "ad406aa3-aab4-4a6a-aa73-3e870a6316ae"}], "ansible_fips": false, "ansible_local": {}, "ansible_interfaces": ["eth0", "rpltstbr", "lo"], "ansible_eth0": {"device": "eth0", "macaddress": "12:8f:92:e7:c1:ab", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.9.148", "broadcast": "10.31.11.255", "netmask": 
"255.255.252.0", "network": "10.31.8.0", "prefix": "22"}, "ipv6": [{"address": "fe80::108f:92ff:fee7:c1ab", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", 
"rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_rpltstbr": {"device": "rpltstbr", "macaddress": "4a:d1:a2:43:cd:1d", "mtu": 1500, "active": false, "type": "bridge", "interfaces": [], "id": "8000.000000000000", "stp": false, "speed": -1, "promisc": false, "ipv4": {"address": "192.0.2.72", "broadcast": "", "netmask": "255.255.255.254", "network": "192.0.2.72", "prefix": "31"}, "features": {"rx_checksumming": "off [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "on", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "on", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "on", "tx_fcoe_segmentation": "on", "tx_gre_segmentation": "on", "tx_gre_csum_segmentation": "on", "tx_ipxip4_segmentation": "on", "tx_ipxip6_segmentation": "on", "tx_udp_tnl_segmentation": "on", "tx_udp_tnl_csum_segmentation": "on", "tx_gso_partial": "on", "tx_tunnel_remcsum_segmentation": "on", 
"tx_sctp_segmentation": "on", "tx_esp_segmentation": "on", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "on", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", 
"highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_<<< 41684 1727204445.06958: stdout chunk (state=3): >>>fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.8.1", "interface": "eth0", "address": "10.31.9.148", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22", "macaddress": "12:8f:92:e7:c1:ab", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.9.148", "192.0.2.72"], "ansible_all_ipv6_addresses": 
["fe80::108f:92ff:fee7:c1ab"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.9.148", "127.0.0.0/8", "127.0.0.1", "192.0.2.72"], "ipv6": ["::1", "fe80::108f:92ff:fee7:c1ab"]}, "ansible_env": {"PYTHONVERBOSE": "1", "SHELL": "/bin/bash", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.9", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.14.85 42862 10.31.9.148 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.14.85 42862 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "which_declare": "declare -f", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0", "BASH_FUNC_which%%": "() { ( alias;\n eval ${which_declare} ) | /usr/bin/which --tty-only --read-alias --read-functions --show-tilde --show-dot $@\n}"}, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_pkg_mgr": "dnf", "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_service_mgr": "systemd", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 41684 1727204445.07661: stdout chunk (state=3): >>># clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_type # clear sys.last_value # clear sys.last_traceback <<< 41684 1727204445.07670: stdout chunk (state=3): >>># clear sys.path_hooks # clear 
sys.path_importer_cache # clear sys.meta_path # clear sys.__interactivehook__ # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal<<< 41684 1727204445.07673: stdout chunk (state=3): >>> # cleanup[2] removing encodings.latin_1 # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat <<< 41684 1727204445.07678: stdout chunk (state=3): >>># cleanup[2] removing _collections_abc <<< 41684 1727204445.07682: stdout chunk (state=3): >>># cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os<<< 41684 1727204445.07685: stdout chunk (state=3): >>> # cleanup[2] removing _sitebuiltins # cleanup[2] removing _locale # cleanup[2] removing _bootlocale # destroy _bootlocale # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing sre_constants # destroy sre_constants <<< 41684 1727204445.07690: stdout chunk (state=3): >>># cleanup[2] removing sre_parse # cleanup[2] removing sre_compile # cleanup[2] removing _heapq # cleanup[2] removing heapq # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword <<< 41684 1727204445.07695: stdout chunk (state=3): >>># cleanup[2] removing _operator <<< 41684 
1727204445.07701: stdout chunk (state=3): >>># cleanup[2] removing operator # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing copyreg<<< 41684 1727204445.07706: stdout chunk (state=3): >>> # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct <<< 41684 1727204445.07712: stdout chunk (state=3): >>># cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing collections.abc <<< 41684 1727204445.07715: stdout chunk (state=3): >>># cleanup[2] removing contextlib # cleanup[2] removing typing # destroy typing # cleanup[2] removing importlib.abc # cleanup[2] removing importlib.util # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib<<< 41684 1727204445.07721: stdout chunk (state=3): >>> # cleanup[2] removing _compression # cleanup[2] removing threading # cleanup[2] removing _bz2 # destroy _bz2<<< 41684 1727204445.07723: stdout chunk (state=3): >>> # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing pwd # cleanup[2] removing grp # cleanup[2] removing shutil <<< 41684 1727204445.07727: stdout chunk (state=3): >>># cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random <<< 41684 1727204445.07730: stdout chunk (state=3): >>># 
destroy random # cleanup[2] removing tempfile # cleanup[2] removing zipfile # destroy zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing ansible # destroy ansible <<< 41684 1727204445.07734: stdout chunk (state=3): >>># cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ <<< 41684 1727204445.07736: stdout chunk (state=3): >>># cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder <<< 41684 1727204445.07818: stdout chunk (state=3): >>># cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing fcntl # cleanup[2] removing locale # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing subprocess # cleanup[2] removing platform # cleanup[2] removing shlex # cleanup[2] removing token # destroy token # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing 
ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale<<< 41684 1727204445.07831: stdout chunk (state=3): >>> # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file <<< 
41684 1727204445.07836: stdout chunk (state=3): >>># destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info<<< 41684 1727204445.07839: stdout chunk (state=3): >>> # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction <<< 41684 1727204445.07843: stdout chunk (state=3): >>># cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing <<< 41684 1727204445.07869: stdout chunk (state=3): >>># cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing 
ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform<<< 41684 1727204445.07896: stdout chunk (state=3): >>> # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing 
ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd<<< 41684 1727204445.07925: stdout chunk (state=3): >>> # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux <<< 41684 1727204445.07946: stdout chunk (state=3): >>># cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # 
cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts <<< 41684 1727204445.07975: stdout chunk (state=3): >>># destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline<<< 41684 1727204445.08002: stdout chunk (state=3): >>> # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # 
destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base <<< 41684 1727204445.08031: stdout chunk (state=3): >>># destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd <<< 41684 1727204445.08049: stdout chunk (state=3): >>># destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy 
ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing gc # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna # cleanup[2] removing multiprocessing.queues # cleanup[2] removing multiprocessing.synchronize # cleanup[2] removing multiprocessing.dummy.connection # cleanup[2] removing multiprocessing.dummy <<< 41684 1727204445.08328: stdout chunk (state=3): >>># destroy _sitebuiltins <<< 41684 1727204445.08345: stdout chunk (state=3): >>># destroy importlib.util # destroy importlib.abc # destroy importlib.machinery <<< 41684 1727204445.08373: stdout chunk (state=3): >>># destroy zipimport <<< 41684 1727204445.08389: stdout chunk (state=3): >>># destroy _compression # destroy binascii # destroy importlib # destroy bz2 # destroy lzma <<< 41684 1727204445.08419: stdout chunk (state=3): >>># destroy __main__ # destroy locale # destroy systemd.journal # destroy systemd.daemon # destroy hashlib <<< 41684 1727204445.08433: stdout chunk (state=3): >>># destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy encodings <<< 41684 1727204445.08459: stdout chunk (state=3): >>># destroy syslog # destroy uuid <<< 41684 1727204445.08499: stdout chunk (state=3): >>># destroy selinux <<< 41684 1727204445.08518: stdout chunk (state=3): >>># destroy distro # destroy logging # destroy argparse <<< 41684 1727204445.08540: stdout chunk (state=3): >>># destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector <<< 41684 1727204445.08559: stdout chunk (state=3): >>># destroy multiprocessing # destroy multiprocessing.queues # destroy multiprocessing.synchronize # destroy multiprocessing.dummy # destroy multiprocessing.pool # destroy pickle # destroy _compat_pickle <<< 41684 1727204445.08578: stdout chunk (state=3): >>># destroy queue <<< 41684 1727204445.08596: stdout chunk (state=3): >>># destroy 
multiprocessing.reduction <<< 41684 1727204445.08619: stdout chunk (state=3): >>># destroy shlex # destroy datetime <<< 41684 1727204445.08639: stdout chunk (state=3): >>># destroy base64 # destroy ansible.module_utils.compat.selinux # destroy getpass <<< 41684 1727204445.08655: stdout chunk (state=3): >>># destroy json <<< 41684 1727204445.08676: stdout chunk (state=3): >>># destroy socket # destroy struct <<< 41684 1727204445.08696: stdout chunk (state=3): >>># destroy glob # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector <<< 41684 1727204445.08708: stdout chunk (state=3): >>># destroy multiprocessing.connection # destroy tempfile # destroy multiprocessing.context # destroy multiprocessing.process # destroy multiprocessing.util # destroy array # destroy multiprocessing.dummy.connection <<< 41684 1727204445.08846: stdout chunk (state=3): >>># cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping unicodedata # cleanup[3] wiping gc # cleanup[3] wiping termios # cleanup[3] wiping _ssl # cleanup[3] wiping configparser # cleanup[3] wiping _multiprocessing # cleanup[3] wiping _queue # cleanup[3] wiping _pickle # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # cleanup[3] wiping tokenize # cleanup[3] wiping platform # destroy subprocess # cleanup[3] wiping selectors # cleanup[3] wiping select # cleanup[3] wiping _posixsubprocess # cleanup[3] wiping signal # cleanup[3] wiping 
fcntl # cleanup[3] wiping atexit # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping _blake2 # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping shutil # destroy fnmatch # cleanup[3] wiping grp # cleanup[3] wiping pwd # cleanup[3] wiping _lzma # cleanup[3] wiping threading # cleanup[3] wiping zlib # cleanup[3] wiping errno # cleanup[3] wiping weakref # cleanup[3] wiping contextlib # cleanup[3] wiping collections.abc # cleanup[3] wiping warnings <<< 41684 1727204445.08872: stdout chunk (state=3): >>># cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy enum # destroy sre_compile # destroy copyreg <<< 41684 1727204445.08894: stdout chunk (state=3): >>># cleanup[3] wiping functools # cleanup[3] wiping _functools # destroy _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy heapq <<< 41684 1727204445.08914: stdout chunk (state=3): >>># destroy collections.abc # cleanup[3] wiping _collections # destroy _collections # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping itertools # cleanup[3] wiping _heapq # cleanup[3] wiping sre_parse # cleanup[3] wiping _sre # cleanup[3] wiping types # cleanup[3] wiping _locale # destroy _locale # cleanup[3] wiping os <<< 41684 1727204445.08937: stdout chunk (state=3): >>># cleanup[3] wiping os.path # destroy genericpath # cleanup[3] wiping posixpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.latin_1 # cleanup[3] wiping _signal <<< 41684 1727204445.08963: stdout chunk (state=3): >>># cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] 
wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io <<< 41684 1727204445.08980: stdout chunk (state=3): >>># cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins <<< 41684 1727204445.09000: stdout chunk (state=3): >>># destroy unicodedata # destroy gc<<< 41684 1727204445.09019: stdout chunk (state=3): >>> # destroy termios # destroy _ssl # destroy _multiprocessing # destroy _queue # destroy _pickle # destroy systemd._daemon # destroy _socket # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy fcntl # destroy _blake2 <<< 41684 1727204445.09035: stdout chunk (state=3): >>># destroy _lzma # destroy zlib # destroy _signal <<< 41684 1727204445.09278: stdout chunk (state=3): >>># destroy platform # destroy _uuid # destroy _sre # destroy sre_parse # destroy tokenize # destroy _heapq # destroy posixpath # destroy stat # destroy ansible.module_utils.six.moves.urllib # destroy errno # destroy signal # destroy contextlib # destroy pwd # destroy grp # destroy _posixsubprocess # destroy selectors # destroy select # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy itertools # destroy operator # destroy ansible.module_utils.six.moves # destroy _operator # destroy _frozen_importlib_external # destroy _imp # destroy io # destroy marshal <<< 41684 1727204445.09388: stdout chunk (state=3): >>># destroy _frozen_importlib # clear sys.audit hooks <<< 41684 1727204445.09688: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.148 closed. 
<<< 41684 1727204445.09698: stdout chunk (state=3): >>><<< 41684 1727204445.09711: stderr chunk (state=3): >>><<< 41684 1727204445.09872: _low_level_execute_command() done: rc=0, stdout=import _frozen_importlib # frozen import _imp # builtin import '_thread' # import '_warnings' # import '_weakref' # import '_io' # import 'marshal' # import 'posix' # import '_frozen_importlib_external' # # installing zipimport hook import 'time' # import 'zipimport' # # installed zipimport hook # /usr/lib64/python3.9/encodings/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/encodings/__init__.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/codecs.cpython-39.pyc matches /usr/lib64/python3.9/codecs.py # code object from '/usr/lib64/python3.9/__pycache__/codecs.cpython-39.pyc' import '_codecs' # import 'codecs' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb5d89b3dc0> # /usr/lib64/python3.9/encodings/__pycache__/aliases.cpython-39.pyc matches /usr/lib64/python3.9/encodings/aliases.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/aliases.cpython-39.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb5d89583a0> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb5d89b3b20> # /usr/lib64/python3.9/encodings/__pycache__/utf_8.cpython-39.pyc matches /usr/lib64/python3.9/encodings/utf_8.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/utf_8.cpython-39.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb5d89b3ac0> import '_signal' # # /usr/lib64/python3.9/encodings/__pycache__/latin_1.cpython-39.pyc matches /usr/lib64/python3.9/encodings/latin_1.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/latin_1.cpython-39.pyc' import 'encodings.latin_1' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb5d8958490> # 
/usr/lib64/python3.9/__pycache__/io.cpython-39.pyc matches /usr/lib64/python3.9/io.py # code object from '/usr/lib64/python3.9/__pycache__/io.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/abc.py # code object from '/usr/lib64/python3.9/__pycache__/abc.cpython-39.pyc' import '_abc' # import 'abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb5d8958940> import 'io' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb5d8958670> # /usr/lib64/python3.9/__pycache__/site.cpython-39.pyc matches /usr/lib64/python3.9/site.py # code object from '/usr/lib64/python3.9/__pycache__/site.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/os.cpython-39.pyc matches /usr/lib64/python3.9/os.py # code object from '/usr/lib64/python3.9/__pycache__/os.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/stat.cpython-39.pyc matches /usr/lib64/python3.9/stat.py # code object from '/usr/lib64/python3.9/__pycache__/stat.cpython-39.pyc' import '_stat' # import 'stat' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb5d890f190> # /usr/lib64/python3.9/__pycache__/_collections_abc.cpython-39.pyc matches /usr/lib64/python3.9/_collections_abc.py # code object from '/usr/lib64/python3.9/__pycache__/_collections_abc.cpython-39.pyc' import '_collections_abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb5d890f220> # /usr/lib64/python3.9/__pycache__/posixpath.cpython-39.pyc matches /usr/lib64/python3.9/posixpath.py # code object from '/usr/lib64/python3.9/__pycache__/posixpath.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/genericpath.cpython-39.pyc matches /usr/lib64/python3.9/genericpath.py # code object from '/usr/lib64/python3.9/__pycache__/genericpath.cpython-39.pyc' import 'genericpath' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb5d8932850> import 'posixpath' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb5d890f940> import 'os' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7fb5d8970880> # /usr/lib64/python3.9/__pycache__/_sitebuiltins.cpython-39.pyc matches /usr/lib64/python3.9/_sitebuiltins.py # code object from '/usr/lib64/python3.9/__pycache__/_sitebuiltins.cpython-39.pyc' import '_sitebuiltins' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb5d8908d90> # /usr/lib64/python3.9/__pycache__/_bootlocale.cpython-39.pyc matches /usr/lib64/python3.9/_bootlocale.py # code object from '/usr/lib64/python3.9/__pycache__/_bootlocale.cpython-39.pyc' import '_locale' # import '_bootlocale' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb5d8932d90> import 'site' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb5d8958970> Python 3.9.19 (main, Aug 23 2024, 00:00:00) [GCC 11.5.0 20240719 (Red Hat 11.5.0-2)] on linux Type "help", "copyright", "credits" or "license" for more information. # /usr/lib64/python3.9/__pycache__/base64.cpython-39.pyc matches /usr/lib64/python3.9/base64.py # code object from '/usr/lib64/python3.9/__pycache__/base64.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/re.cpython-39.pyc matches /usr/lib64/python3.9/re.py # code object from '/usr/lib64/python3.9/__pycache__/re.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/enum.cpython-39.pyc matches /usr/lib64/python3.9/enum.py # code object from '/usr/lib64/python3.9/__pycache__/enum.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/types.cpython-39.pyc matches /usr/lib64/python3.9/types.py # code object from '/usr/lib64/python3.9/__pycache__/types.cpython-39.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb5d88aef10> import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb5d88b40a0> # /usr/lib64/python3.9/__pycache__/sre_compile.cpython-39.pyc matches /usr/lib64/python3.9/sre_compile.py # code object from '/usr/lib64/python3.9/__pycache__/sre_compile.cpython-39.pyc' import '_sre' # # 
/usr/lib64/python3.9/__pycache__/sre_parse.cpython-39.pyc matches /usr/lib64/python3.9/sre_parse.py # code object from '/usr/lib64/python3.9/__pycache__/sre_parse.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/sre_constants.cpython-39.pyc matches /usr/lib64/python3.9/sre_constants.py # code object from '/usr/lib64/python3.9/__pycache__/sre_constants.cpython-39.pyc' import 'sre_constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb5d88a75b0> import 'sre_parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb5d88af6a0> import 'sre_compile' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb5d88ae3d0> # /usr/lib64/python3.9/__pycache__/functools.cpython-39.pyc matches /usr/lib64/python3.9/functools.py # code object from '/usr/lib64/python3.9/__pycache__/functools.cpython-39.pyc' # /usr/lib64/python3.9/collections/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/collections/__init__.py # code object from '/usr/lib64/python3.9/collections/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/heapq.cpython-39.pyc matches /usr/lib64/python3.9/heapq.py # code object from '/usr/lib64/python3.9/__pycache__/heapq.cpython-39.pyc' # extension module '_heapq' loaded from '/usr/lib64/python3.9/lib-dynload/_heapq.cpython-39-x86_64-linux-gnu.so' # extension module '_heapq' executed from '/usr/lib64/python3.9/lib-dynload/_heapq.cpython-39-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb5d8831e20> import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb5d8831910> import 'itertools' # # /usr/lib64/python3.9/__pycache__/keyword.cpython-39.pyc matches /usr/lib64/python3.9/keyword.py # code object from '/usr/lib64/python3.9/__pycache__/keyword.cpython-39.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb5d8831f10> # /usr/lib64/python3.9/__pycache__/operator.cpython-39.pyc matches 
/usr/lib64/python3.9/operator.py # code object from '/usr/lib64/python3.9/__pycache__/operator.cpython-39.pyc' import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb5d8831fd0> # /usr/lib64/python3.9/__pycache__/reprlib.cpython-39.pyc matches /usr/lib64/python3.9/reprlib.py # code object from '/usr/lib64/python3.9/__pycache__/reprlib.cpython-39.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb5d88420d0> import '_collections' # import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb5d8889d90> import '_functools' # import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb5d8882670> # /usr/lib64/python3.9/__pycache__/copyreg.cpython-39.pyc matches /usr/lib64/python3.9/copyreg.py # code object from '/usr/lib64/python3.9/__pycache__/copyreg.cpython-39.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb5d88956d0> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb5d88b5e80> # /usr/lib64/python3.9/__pycache__/struct.cpython-39.pyc matches /usr/lib64/python3.9/struct.py # code object from '/usr/lib64/python3.9/__pycache__/struct.cpython-39.pyc' # extension module '_struct' loaded from '/usr/lib64/python3.9/lib-dynload/_struct.cpython-39-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.9/lib-dynload/_struct.cpython-39-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb5d8842cd0> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb5d88892b0> # extension module 'binascii' loaded from '/usr/lib64/python3.9/lib-dynload/binascii.cpython-39-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.9/lib-dynload/binascii.cpython-39-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb5d88952e0> import 
'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb5d88bba30> # /usr/lib64/python3.9/__pycache__/runpy.cpython-39.pyc matches /usr/lib64/python3.9/runpy.py # code object from '/usr/lib64/python3.9/__pycache__/runpy.cpython-39.pyc' # /usr/lib64/python3.9/importlib/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/importlib/__init__.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/warnings.cpython-39.pyc matches /usr/lib64/python3.9/warnings.py # code object from '/usr/lib64/python3.9/__pycache__/warnings.cpython-39.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb5d8842eb0> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb5d8842df0> # /usr/lib64/python3.9/importlib/__pycache__/machinery.cpython-39.pyc matches /usr/lib64/python3.9/importlib/machinery.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/machinery.cpython-39.pyc' import 'importlib.machinery' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb5d8842d60> # /usr/lib64/python3.9/importlib/__pycache__/util.cpython-39.pyc matches /usr/lib64/python3.9/importlib/util.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/util.cpython-39.pyc' # /usr/lib64/python3.9/importlib/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/importlib/abc.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/abc.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/typing.cpython-39.pyc matches /usr/lib64/python3.9/typing.py # code object from '/usr/lib64/python3.9/__pycache__/typing.cpython-39.pyc' # /usr/lib64/python3.9/collections/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/collections/abc.py # code object from '/usr/lib64/python3.9/collections/__pycache__/abc.cpython-39.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb5d85723d0> # 
/usr/lib64/python3.9/__pycache__/contextlib.cpython-39.pyc matches /usr/lib64/python3.9/contextlib.py # code object from '/usr/lib64/python3.9/__pycache__/contextlib.cpython-39.pyc' import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb5d85724c0> import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb5d884af40> import 'importlib.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb5d8844a90> import 'importlib.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb5d8844490> # /usr/lib64/python3.9/__pycache__/pkgutil.cpython-39.pyc matches /usr/lib64/python3.9/pkgutil.py # code object from '/usr/lib64/python3.9/__pycache__/pkgutil.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/weakref.cpython-39.pyc matches /usr/lib64/python3.9/weakref.py # code object from '/usr/lib64/python3.9/__pycache__/weakref.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/_weakrefset.cpython-39.pyc matches /usr/lib64/python3.9/_weakrefset.py # code object from '/usr/lib64/python3.9/__pycache__/_weakrefset.cpython-39.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb5d848d220> import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb5d855d520> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb5d8844f10> import 'runpy' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb5d88bb0a0> # /usr/lib64/python3.9/__pycache__/shutil.cpython-39.pyc matches /usr/lib64/python3.9/shutil.py # code object from '/usr/lib64/python3.9/__pycache__/shutil.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/fnmatch.cpython-39.pyc matches /usr/lib64/python3.9/fnmatch.py # code object from '/usr/lib64/python3.9/__pycache__/fnmatch.cpython-39.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb5d849eb50> import 'errno' # # extension module 'zlib' loaded from 
'/usr/lib64/python3.9/lib-dynload/zlib.cpython-39-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.9/lib-dynload/zlib.cpython-39-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb5d849ee80> # /usr/lib64/python3.9/__pycache__/bz2.cpython-39.pyc matches /usr/lib64/python3.9/bz2.py # code object from '/usr/lib64/python3.9/__pycache__/bz2.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/_compression.cpython-39.pyc matches /usr/lib64/python3.9/_compression.py # code object from '/usr/lib64/python3.9/__pycache__/_compression.cpython-39.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb5d84b0790> # /usr/lib64/python3.9/__pycache__/threading.cpython-39.pyc matches /usr/lib64/python3.9/threading.py # code object from '/usr/lib64/python3.9/__pycache__/threading.cpython-39.pyc' import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb5d84b0cd0> # extension module '_bz2' loaded from '/usr/lib64/python3.9/lib-dynload/_bz2.cpython-39-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.9/lib-dynload/_bz2.cpython-39-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb5d843e400> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb5d849ef70> # /usr/lib64/python3.9/__pycache__/lzma.cpython-39.pyc matches /usr/lib64/python3.9/lzma.py # code object from '/usr/lib64/python3.9/__pycache__/lzma.cpython-39.pyc' # extension module '_lzma' loaded from '/usr/lib64/python3.9/lib-dynload/_lzma.cpython-39-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.9/lib-dynload/_lzma.cpython-39-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb5d844e2e0> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb5d84b0610> import 'pwd' # # extension module 'grp' 
loaded from '/usr/lib64/python3.9/lib-dynload/grp.cpython-39-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.9/lib-dynload/grp.cpython-39-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb5d844e3a0> import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb5d8842a30> # /usr/lib64/python3.9/__pycache__/tempfile.cpython-39.pyc matches /usr/lib64/python3.9/tempfile.py # code object from '/usr/lib64/python3.9/__pycache__/tempfile.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/random.cpython-39.pyc matches /usr/lib64/python3.9/random.py # code object from '/usr/lib64/python3.9/__pycache__/random.cpython-39.pyc' # extension module 'math' loaded from '/usr/lib64/python3.9/lib-dynload/math.cpython-39-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.9/lib-dynload/math.cpython-39-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb5d846a700> # /usr/lib64/python3.9/__pycache__/bisect.cpython-39.pyc matches /usr/lib64/python3.9/bisect.py # code object from '/usr/lib64/python3.9/__pycache__/bisect.cpython-39.pyc' # extension module '_bisect' loaded from '/usr/lib64/python3.9/lib-dynload/_bisect.cpython-39-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.9/lib-dynload/_bisect.cpython-39-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb5d846a9d0> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb5d846a7c0> # extension module '_random' loaded from '/usr/lib64/python3.9/lib-dynload/_random.cpython-39-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.9/lib-dynload/_random.cpython-39-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb5d846a8b0> # /usr/lib64/python3.9/__pycache__/hashlib.cpython-39.pyc 
matches /usr/lib64/python3.9/hashlib.py # code object from '/usr/lib64/python3.9/__pycache__/hashlib.cpython-39.pyc' # extension module '_hashlib' loaded from '/usr/lib64/python3.9/lib-dynload/_hashlib.cpython-39-x86_64-linux-gnu.so' # extension module '_hashlib' executed from '/usr/lib64/python3.9/lib-dynload/_hashlib.cpython-39-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb5d846ad00> # extension module '_blake2' loaded from '/usr/lib64/python3.9/lib-dynload/_blake2.cpython-39-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.9/lib-dynload/_blake2.cpython-39-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb5d8476250> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb5d846a940> import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb5d845da90> import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb5d8842610> # /usr/lib64/python3.9/__pycache__/zipfile.cpython-39.pyc matches /usr/lib64/python3.9/zipfile.py # code object from '/usr/lib64/python3.9/__pycache__/zipfile.cpython-39.pyc' import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb5d846aaf0> # code object from '/usr/lib64/python3.9/encodings/cp437.pyc' import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7fb5d83936d0> # zipimport: found 103 names in '/tmp/ansible_ansible.legacy.setup_payload__66pz3fr/ansible_ansible.legacy.setup_payload.zip' # zipimport: zlib available # zipimport: zlib available import ansible # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload__66pz3fr/ansible_ansible.legacy.setup_payload.zip/ansible/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils # loaded from Zip 
/tmp/ansible_ansible.legacy.setup_payload__66pz3fr/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/__init__.py # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/__pycache__/__future__.cpython-39.pyc matches /usr/lib64/python3.9/__future__.py # code object from '/usr/lib64/python3.9/__pycache__/__future__.cpython-39.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb5d7dae820> # /usr/lib64/python3.9/json/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/json/__init__.py # code object from '/usr/lib64/python3.9/json/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/json/__pycache__/decoder.cpython-39.pyc matches /usr/lib64/python3.9/json/decoder.py # code object from '/usr/lib64/python3.9/json/__pycache__/decoder.cpython-39.pyc' # /usr/lib64/python3.9/json/__pycache__/scanner.cpython-39.pyc matches /usr/lib64/python3.9/json/scanner.py # code object from '/usr/lib64/python3.9/json/__pycache__/scanner.cpython-39.pyc' # extension module '_json' loaded from '/usr/lib64/python3.9/lib-dynload/_json.cpython-39-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.9/lib-dynload/_json.cpython-39-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb5d7dae160> import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb5d7dae280> import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb5d7daef70> # /usr/lib64/python3.9/json/__pycache__/encoder.cpython-39.pyc matches /usr/lib64/python3.9/json/encoder.py # code object from '/usr/lib64/python3.9/json/__pycache__/encoder.cpython-39.pyc' import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb5d7dae4f0> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb5d7daed90> import 'atexit' # # extension module 'fcntl' loaded from 
'/usr/lib64/python3.9/lib-dynload/fcntl.cpython-39-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.9/lib-dynload/fcntl.cpython-39-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb5d7daefd0> # /usr/lib64/python3.9/__pycache__/locale.cpython-39.pyc matches /usr/lib64/python3.9/locale.py # code object from '/usr/lib64/python3.9/__pycache__/locale.cpython-39.pyc' import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb5d7dae100> # /usr/lib64/python3.9/__pycache__/platform.cpython-39.pyc matches /usr/lib64/python3.9/platform.py # code object from '/usr/lib64/python3.9/__pycache__/platform.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/subprocess.cpython-39.pyc matches /usr/lib64/python3.9/subprocess.py # code object from '/usr/lib64/python3.9/__pycache__/subprocess.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/signal.cpython-39.pyc matches /usr/lib64/python3.9/signal.py # code object from '/usr/lib64/python3.9/__pycache__/signal.cpython-39.pyc' import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb5d7d830d0> # extension module '_posixsubprocess' loaded from '/usr/lib64/python3.9/lib-dynload/_posixsubprocess.cpython-39-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.9/lib-dynload/_posixsubprocess.cpython-39-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb5d7c88340> # extension module 'select' loaded from '/usr/lib64/python3.9/lib-dynload/select.cpython-39-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.9/lib-dynload/select.cpython-39-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb5d7c88040> # /usr/lib64/python3.9/__pycache__/selectors.cpython-39.pyc matches /usr/lib64/python3.9/selectors.py # code object from 
'/usr/lib64/python3.9/__pycache__/selectors.cpython-39.pyc' import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb5d7c88ca0> import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb5d7d96dc0> import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb5d7d963a0> # /usr/lib64/python3.9/__pycache__/shlex.cpython-39.pyc matches /usr/lib64/python3.9/shlex.py # code object from '/usr/lib64/python3.9/__pycache__/shlex.cpython-39.pyc' import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb5d7d96fd0> # /usr/lib64/python3.9/__pycache__/traceback.cpython-39.pyc matches /usr/lib64/python3.9/traceback.py # code object from '/usr/lib64/python3.9/__pycache__/traceback.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/linecache.cpython-39.pyc matches /usr/lib64/python3.9/linecache.py # code object from '/usr/lib64/python3.9/__pycache__/linecache.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/tokenize.cpython-39.pyc matches /usr/lib64/python3.9/tokenize.py # code object from '/usr/lib64/python3.9/__pycache__/tokenize.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/token.cpython-39.pyc matches /usr/lib64/python3.9/token.py # code object from '/usr/lib64/python3.9/__pycache__/token.cpython-39.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb5d7de3d30> import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb5d7db5d30> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb5d7db5400> import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb5d7d61b20> # extension module 'syslog' loaded from '/usr/lib64/python3.9/lib-dynload/syslog.cpython-39-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.9/lib-dynload/syslog.cpython-39-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb5d7db5520> # 
/usr/lib64/python3.9/site-packages/systemd/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/__init__.cpython-39.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb5d7db5550> # /usr/lib64/python3.9/site-packages/systemd/__pycache__/journal.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/journal.py # code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/journal.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/datetime.cpython-39.pyc matches /usr/lib64/python3.9/datetime.py # code object from '/usr/lib64/python3.9/__pycache__/datetime.cpython-39.pyc' # extension module '_datetime' loaded from '/usr/lib64/python3.9/lib-dynload/_datetime.cpython-39-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.9/lib-dynload/_datetime.cpython-39-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb5d7cf6fd0> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb5d7df6250> # /usr/lib64/python3.9/__pycache__/uuid.cpython-39.pyc matches /usr/lib64/python3.9/uuid.py # code object from '/usr/lib64/python3.9/__pycache__/uuid.cpython-39.pyc' # extension module '_uuid' loaded from '/usr/lib64/python3.9/lib-dynload/_uuid.cpython-39-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.9/lib-dynload/_uuid.cpython-39-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb5d7cf4850> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb5d7df63d0> # /usr/lib64/python3.9/logging/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/logging/__init__.py # code object from '/usr/lib64/python3.9/logging/__pycache__/__init__.cpython-39.pyc' # 
/usr/lib64/python3.9/__pycache__/string.cpython-39.pyc matches /usr/lib64/python3.9/string.py # code object from '/usr/lib64/python3.9/__pycache__/string.cpython-39.pyc' import '_string' # import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb5d7df6ca0> import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb5d7cf47f0> # extension module 'systemd._journal' loaded from '/usr/lib64/python3.9/site-packages/systemd/_journal.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.9/site-packages/systemd/_journal.cpython-39-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb5d7d8ec10> # extension module 'systemd._reader' loaded from '/usr/lib64/python3.9/site-packages/systemd/_reader.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.9/site-packages/systemd/_reader.cpython-39-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb5d7df6fa0> # extension module 'systemd.id128' loaded from '/usr/lib64/python3.9/site-packages/systemd/id128.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.9/site-packages/systemd/id128.cpython-39-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb5d7df6550> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb5d7dee910> # /usr/lib64/python3.9/site-packages/systemd/__pycache__/daemon.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/daemon.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/socket.cpython-39.pyc matches /usr/lib64/python3.9/socket.py # code object from '/usr/lib64/python3.9/__pycache__/socket.cpython-39.pyc' # extension module 
'_socket' loaded from '/usr/lib64/python3.9/lib-dynload/_socket.cpython-39-x86_64-linux-gnu.so' # extension module '_socket' executed from '/usr/lib64/python3.9/lib-dynload/_socket.cpython-39-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb5d7ce8940> # extension module 'array' loaded from '/usr/lib64/python3.9/lib-dynload/array.cpython-39-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.9/lib-dynload/array.cpython-39-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb5d7d06d90> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb5d7cf2580> # extension module 'systemd._daemon' loaded from '/usr/lib64/python3.9/site-packages/systemd/_daemon.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.9/site-packages/systemd/_daemon.cpython-39-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb5d7ce8ee0> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb5d7cf29a0> # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.compat # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload__66pz3fr/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/compat/__init__.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload__66pz3fr/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common.text # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload__66pz3fr/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/text/__init__.py # zipimport: zlib available # 
zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.six # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload__66pz3fr/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/six/__init__.py import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import ansible.module_utils.common.text.converters # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload__66pz3fr/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/text/converters.py # /usr/lib64/python3.9/ctypes/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/ctypes/__init__.py # code object from '/usr/lib64/python3.9/ctypes/__pycache__/__init__.cpython-39.pyc' # extension module '_ctypes' loaded from '/usr/lib64/python3.9/lib-dynload/_ctypes.cpython-39-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.9/lib-dynload/_ctypes.cpython-39-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb5d7d367f0> # /usr/lib64/python3.9/ctypes/__pycache__/_endian.cpython-39.pyc matches /usr/lib64/python3.9/ctypes/_endian.py # code object from '/usr/lib64/python3.9/ctypes/__pycache__/_endian.cpython-39.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb5d7d04760> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb5d787a970> import ansible.module_utils.compat.selinux # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload__66pz3fr/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/compat/selinux.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils._text # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload__66pz3fr/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/_text.py # zipimport: zlib available # zipimport: zlib available # 
/usr/lib64/python3.9/__pycache__/copy.cpython-39.pyc matches /usr/lib64/python3.9/copy.py # code object from '/usr/lib64/python3.9/__pycache__/copy.cpython-39.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb5d7d6c730> # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common.collections # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload__66pz3fr/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/collections.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common.warnings # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload__66pz3fr/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/warnings.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.errors # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload__66pz3fr/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/errors.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.parsing # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload__66pz3fr/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/parsing/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.parsing.convert_bool # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload__66pz3fr/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/parsing/convert_bool.py # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/__pycache__/ast.cpython-39.pyc matches /usr/lib64/python3.9/ast.py # code object from '/usr/lib64/python3.9/__pycache__/ast.cpython-39.pyc' import '_ast' # import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb5d7db1370> # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common.text.formatters # loaded from Zip 
/tmp/ansible_ansible.legacy.setup_payload__66pz3fr/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/text/formatters.py import ansible.module_utils.common.validation # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload__66pz3fr/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/validation.py import ansible.module_utils.common.parameters # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload__66pz3fr/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/parameters.py import ansible.module_utils.common.arg_spec # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload__66pz3fr/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/arg_spec.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common.locale # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload__66pz3fr/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/locale.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/site-packages/selinux/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/selinux/__init__.py # code object from '/usr/lib64/python3.9/site-packages/selinux/__pycache__/__init__.cpython-39.pyc' # extension module 'selinux._selinux' loaded from '/usr/lib64/python3.9/site-packages/selinux/_selinux.cpython-39-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.9/site-packages/selinux/_selinux.cpython-39-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb5d7d23550> import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb5d76f6eb0> import ansible.module_utils.common.file # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload__66pz3fr/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/file.py import 
ansible.module_utils.common.process # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload__66pz3fr/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/process.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib/python3.9/site-packages/__pycache__/distro.cpython-39.pyc matches /usr/lib/python3.9/site-packages/distro.py # code object from '/usr/lib/python3.9/site-packages/__pycache__/distro.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/argparse.cpython-39.pyc matches /usr/lib64/python3.9/argparse.py # code object from '/usr/lib64/python3.9/__pycache__/argparse.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/gettext.cpython-39.pyc matches /usr/lib64/python3.9/gettext.py # code object from '/usr/lib64/python3.9/__pycache__/gettext.cpython-39.pyc' import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb5d7d297f0> import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb5d7d28790> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb5d7d23b50> # destroy ansible.module_utils.distro import ansible.module_utils.distro # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload__66pz3fr/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/distro/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common._utils # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload__66pz3fr/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/_utils.py import ansible.module_utils.common.sys_info # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload__66pz3fr/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/sys_info.py import ansible.module_utils.basic # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload__66pz3fr/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/basic.py # zipimport: zlib available # 
zipimport: zlib available import ansible.modules # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload__66pz3fr/ansible_ansible.legacy.setup_payload.zip/ansible/modules/__init__.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.namespace # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload__66pz3fr/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/namespace.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.compat.typing # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload__66pz3fr/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/compat/typing.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/multiprocessing/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/__init__.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/multiprocessing/__pycache__/context.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/context.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/context.cpython-39.pyc' # /usr/lib64/python3.9/multiprocessing/__pycache__/process.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/process.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/process.cpython-39.pyc' import 'multiprocessing.process' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb5d7840370> # /usr/lib64/python3.9/multiprocessing/__pycache__/reduction.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/reduction.py # code object from 
'/usr/lib64/python3.9/multiprocessing/__pycache__/reduction.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/pickle.cpython-39.pyc matches /usr/lib64/python3.9/pickle.py # code object from '/usr/lib64/python3.9/__pycache__/pickle.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/_compat_pickle.cpython-39.pyc matches /usr/lib64/python3.9/_compat_pickle.py # code object from '/usr/lib64/python3.9/__pycache__/_compat_pickle.cpython-39.pyc' import '_compat_pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb5d785c580> # extension module '_pickle' loaded from '/usr/lib64/python3.9/lib-dynload/_pickle.cpython-39-x86_64-linux-gnu.so' # extension module '_pickle' executed from '/usr/lib64/python3.9/lib-dynload/_pickle.cpython-39-x86_64-linux-gnu.so' import '_pickle' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb5d785c4f0> import 'pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb5d7830280> import 'multiprocessing.reduction' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb5d7840970> import 'multiprocessing.context' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb5d75fb7f0> import 'multiprocessing' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb5d75fbb20> # /usr/lib64/python3.9/multiprocessing/__pycache__/pool.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/pool.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/pool.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/queue.cpython-39.pyc matches /usr/lib64/python3.9/queue.py # code object from '/usr/lib64/python3.9/__pycache__/queue.cpython-39.pyc' # extension module '_queue' loaded from '/usr/lib64/python3.9/lib-dynload/_queue.cpython-39-x86_64-linux-gnu.so' # extension module '_queue' executed from '/usr/lib64/python3.9/lib-dynload/_queue.cpython-39-x86_64-linux-gnu.so' import '_queue' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb5d789e0a0> import 'queue' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7fb5d783df70> # /usr/lib64/python3.9/multiprocessing/__pycache__/util.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/util.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/util.cpython-39.pyc' import 'multiprocessing.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb5d789e190> # /usr/lib64/python3.9/multiprocessing/__pycache__/connection.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/connection.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/connection.cpython-39.pyc' # extension module '_multiprocessing' loaded from '/usr/lib64/python3.9/lib-dynload/_multiprocessing.cpython-39-x86_64-linux-gnu.so' # extension module '_multiprocessing' executed from '/usr/lib64/python3.9/lib-dynload/_multiprocessing.cpython-39-x86_64-linux-gnu.so' import '_multiprocessing' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb5d7663fd0> import 'multiprocessing.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb5d788c820> import 'multiprocessing.pool' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb5d75fbd60> import ansible.module_utils.facts.timeout # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload__66pz3fr/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/timeout.py import ansible.module_utils.facts.collector # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload__66pz3fr/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/collector.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.other # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload__66pz3fr/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/other/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.other.facter # loaded from Zip 
/tmp/ansible_ansible.legacy.setup_payload__66pz3fr/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/other/facter.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.other.ohai # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload__66pz3fr/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/other/ohai.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload__66pz3fr/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.apparmor # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload__66pz3fr/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/apparmor.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.caps # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload__66pz3fr/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/caps.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.chroot # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload__66pz3fr/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/chroot.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.utils # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload__66pz3fr/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/utils.py import ansible.module_utils.facts.system.cmdline # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload__66pz3fr/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/cmdline.py # zipimport: zlib available # zipimport: zlib available import 
ansible.module_utils.facts.system.distribution # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload__66pz3fr/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/distribution.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.compat.datetime # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload__66pz3fr/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/compat/datetime.py import ansible.module_utils.facts.system.date_time # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload__66pz3fr/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/date_time.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.env # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload__66pz3fr/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/env.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.dns # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload__66pz3fr/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/dns.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.fips # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload__66pz3fr/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/fips.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.loadavg # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload__66pz3fr/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/loadavg.py # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/__pycache__/glob.cpython-39.pyc matches /usr/lib64/python3.9/glob.py # code object from '/usr/lib64/python3.9/__pycache__/glob.cpython-39.pyc' import 'glob' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7fb5d7549e80> # /usr/lib64/python3.9/__pycache__/configparser.cpython-39.pyc matches /usr/lib64/python3.9/configparser.py # code object from '/usr/lib64/python3.9/__pycache__/configparser.cpython-39.pyc' import 'configparser' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb5d75499d0> import ansible.module_utils.facts.system.local # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload__66pz3fr/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/local.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.lsb # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload__66pz3fr/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/lsb.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.pkg_mgr # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload__66pz3fr/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/pkg_mgr.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.platform # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload__66pz3fr/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/platform.py # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/__pycache__/ssl.cpython-39.pyc matches /usr/lib64/python3.9/ssl.py # code object from '/usr/lib64/python3.9/__pycache__/ssl.cpython-39.pyc' # extension module '_ssl' loaded from '/usr/lib64/python3.9/lib-dynload/_ssl.cpython-39-x86_64-linux-gnu.so' # extension module '_ssl' executed from '/usr/lib64/python3.9/lib-dynload/_ssl.cpython-39-x86_64-linux-gnu.so' import '_ssl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb5d75c1490> import 'ssl' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb5d755a850> import ansible.module_utils.facts.system.python # 
loaded from Zip /tmp/ansible_ansible.legacy.setup_payload__66pz3fr/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/python.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.selinux # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload__66pz3fr/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/selinux.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.compat.version # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload__66pz3fr/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/compat/version.py import ansible.module_utils.facts.system.service_mgr # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload__66pz3fr/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/service_mgr.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.ssh_pub_keys # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload__66pz3fr/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/ssh_pub_keys.py # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/__pycache__/getpass.cpython-39.pyc matches /usr/lib64/python3.9/getpass.py # code object from '/usr/lib64/python3.9/__pycache__/getpass.cpython-39.pyc' # extension module 'termios' loaded from '/usr/lib64/python3.9/lib-dynload/termios.cpython-39-x86_64-linux-gnu.so' # extension module 'termios' executed from '/usr/lib64/python3.9/lib-dynload/termios.cpython-39-x86_64-linux-gnu.so' import 'termios' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb5d75be670> import 'getpass' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb5d75be220> import ansible.module_utils.facts.system.user # loaded from Zip 
/tmp/ansible_ansible.legacy.setup_payload__66pz3fr/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/user.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload__66pz3fr/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware.base # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload__66pz3fr/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/base.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware.aix # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload__66pz3fr/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/aix.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.sysctl # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload__66pz3fr/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/sysctl.py import ansible.module_utils.facts.hardware.darwin # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload__66pz3fr/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/darwin.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware.freebsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload__66pz3fr/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/freebsd.py import ansible.module_utils.facts.hardware.dragonfly # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload__66pz3fr/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/dragonfly.py # zipimport: zlib available # zipimport: zlib available 
import ansible.module_utils.facts.hardware.hpux # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload__66pz3fr/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/hpux.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware.linux # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload__66pz3fr/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/linux.py import ansible.module_utils.facts.hardware.hurd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload__66pz3fr/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/hurd.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware.netbsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload__66pz3fr/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/netbsd.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware.openbsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload__66pz3fr/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/openbsd.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware.sunos # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload__66pz3fr/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/sunos.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload__66pz3fr/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.base # loaded from Zip 
/tmp/ansible_ansible.legacy.setup_payload__66pz3fr/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/base.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.generic_bsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload__66pz3fr/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/generic_bsd.py import ansible.module_utils.facts.network.aix # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload__66pz3fr/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/aix.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.darwin # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload__66pz3fr/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/darwin.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.dragonfly # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload__66pz3fr/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/dragonfly.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.fc_wwn # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload__66pz3fr/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/fc_wwn.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.freebsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload__66pz3fr/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/freebsd.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.hpux # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload__66pz3fr/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/hpux.py # zipimport: zlib available # 
zipimport: zlib available import ansible.module_utils.facts.network.hurd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload__66pz3fr/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/hurd.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.linux # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload__66pz3fr/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/linux.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.iscsi # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload__66pz3fr/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/iscsi.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.nvme # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload__66pz3fr/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/nvme.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.netbsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload__66pz3fr/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/netbsd.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.openbsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload__66pz3fr/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/openbsd.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.sunos # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload__66pz3fr/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/sunos.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.virtual # loaded from Zip 
/tmp/ansible_ansible.legacy.setup_payload__66pz3fr/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.virtual.base # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload__66pz3fr/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/base.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.virtual.sysctl # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload__66pz3fr/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/sysctl.py import ansible.module_utils.facts.virtual.freebsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload__66pz3fr/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/freebsd.py import ansible.module_utils.facts.virtual.dragonfly # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload__66pz3fr/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/dragonfly.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.virtual.hpux # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload__66pz3fr/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/hpux.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.virtual.linux # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload__66pz3fr/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/linux.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.virtual.netbsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload__66pz3fr/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/netbsd.py # zipimport: zlib available # zipimport: 
zlib available import ansible.module_utils.facts.virtual.openbsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload__66pz3fr/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/openbsd.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.virtual.sunos # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload__66pz3fr/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/sunos.py import ansible.module_utils.facts.default_collectors # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload__66pz3fr/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/default_collectors.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.ansible_collector # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload__66pz3fr/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/ansible_collector.py import ansible.module_utils.facts.compat # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload__66pz3fr/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/compat.py import ansible.module_utils.facts # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload__66pz3fr/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/__init__.py # zipimport: zlib available import 'gc' # # /usr/lib64/python3.9/encodings/__pycache__/idna.cpython-39.pyc matches /usr/lib64/python3.9/encodings/idna.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/idna.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/stringprep.cpython-39.pyc matches /usr/lib64/python3.9/stringprep.py # code object from '/usr/lib64/python3.9/__pycache__/stringprep.cpython-39.pyc' # extension module 'unicodedata' loaded from '/usr/lib64/python3.9/lib-dynload/unicodedata.cpython-39-x86_64-linux-gnu.so' # extension module 'unicodedata' executed from '/usr/lib64/python3.9/lib-dynload/unicodedata.cpython-39-x86_64-linux-gnu.so' 
import 'unicodedata' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb5d7502130> import 'stringprep' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb5d7d37670> import 'encodings.idna' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb5d75578e0> # /usr/lib64/python3.9/multiprocessing/__pycache__/queues.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/queues.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/queues.cpython-39.pyc' import 'multiprocessing.queues' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb5d7557df0> # /usr/lib64/python3.9/multiprocessing/__pycache__/synchronize.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/synchronize.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/synchronize.cpython-39.pyc' import 'multiprocessing.synchronize' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb5d755a6a0> # /usr/lib64/python3.9/multiprocessing/dummy/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/dummy/__init__.py # code object from '/usr/lib64/python3.9/multiprocessing/dummy/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/multiprocessing/dummy/__pycache__/connection.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/dummy/connection.py # code object from '/usr/lib64/python3.9/multiprocessing/dummy/__pycache__/connection.cpython-39.pyc' import 'multiprocessing.dummy.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb5d73605e0> import 'multiprocessing.dummy' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb5d7383190> PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame {"ansible_facts": {"ansible_distribution": "CentOS", 
"ansible_distribution_release": "Stream", "ansible_distribution_version": "9", "ansible_distribution_major_version": "9", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_fibre_channel_wwn": [], "ansible_ssh_host_key_dsa_public": "AAAAB3NzaC1kc3MAAACBAL33r0sK53nK1ELEWEygFfBly+jKL3G1irB+e4OjfP+034giVSb4+qmZbsccYzFRUysDiwQ9AOkXxjXzuDmR+xqyvjg1wiGR1mtnhVEjD5QOMP3FrsA4T0YUj+99RePF5V1syZcivhL83fhGMQW2xqX2DsatToaaogZ2OB3PfjtjAAAAFQDxVD/D0jmbOX2y1lkpNUepQHopQwAAAIEAmIlCnXBlcPjUWk7TsM1FtBhlH1jHHCOTF1EkipNNB3yizhCo4XzHdZ42Etc3A12/rcZ94rFaauqCV6XrShBkQ2YBIcz9u8BOyWI/nScoq9IA/qLrhWWtjBWDyrdKnEa5YZssQtDa+FaZQkzy1TQpvFQxv5c95+TrmPFgDpw+0q0AAACBAKYOTFtEPTGzq9w6YdMspRWk65ZgXou58bQl818PvNnuZKKVReDFknfNCcabfj+HjlOg9wBCZZ+D3vopxZ4Qgevz/pLqcnLY7Kxx+xf6NhqDwcEwkHk/VYomBLrfyEZP8N81dcv36ZZUVoca5Y+2ZG2o1gC632nLGosyJBtmPmel", "ansible_ssh_host_key_dsa_public_keytype": "ssh-dss", "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCzhflzK5hY1zLI6wcdpu95QnCG0TwdK/8SyDudYYHDoRycyI9pVuSUQsXpQq3jHdjCurFgtKtyr8lvo1dWvKJ9SZpQk4asejncDNODyhSR95eNQg6E1G2kN1mscOp76cjW9Muvyhcku112WdRWTEojLJL5DfJAWrsWwHJI+QuhZuKvrlsxPvfOuY5td/aGC/Ydzbjkmya2qvXXJRscQArDnYmiPiatkFESRif9MXdmIn2LqQXAcZGFUG+SWQvZR1PDWKI2U5HxvoeUf+Uh2jDO3mFDWao9+SGRC2QuO+xLJgoiKIx2L3GWLTkbKjAbsk0iedaUuh+GdmUatsU09UVZi9IYBJYjhiYuZKsYx2LNpBqu8fxh5RaBfadQzkDGVBJE45/9X+9vlSygk3zMak9yWtS9vfV+CoODJx9wA1tv3r0Veiy/Y9bbcT7DtQhiWscP2X/cF2QZtdabW+Rb+zKZomn+6upN+zZeyVRClRsqVNURxevMs+UyJTKV481ayMU=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBHpEZiCiKJZKK5PvXzPGl0kyJcU4P7nxoUjBffLcHt9dAB0RhjGORZ4v3/W6TdO0PAsLaKZ7WyFecLN3V9VWyiA=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": 
"AAAAC3NzaC1lZDI1NTE5AAAAIK5vZWfq5/76ny3vCPOJqG/mpsIiiNwZzQWhA7bM1PFT", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_is_chroot": false, "ansible_python": {"version": {"major": 3, "minor": 9, "micro": 19, "releaselevel": "final", "serial": 0}, "version_info": [3, 9, 19, "final", 0], "executable": "/usr/bin/python3.9", "has_sslcontext": true, "type": "cpython"}, "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "root", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_system": "Linux", "ansible_kernel": "5.14.0-511.el9.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Thu Sep 19 06:52:39 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.9.19", "ansible_fqdn": "managed-node1", "ansible_hostname": "managed-node1", "ansible_nodename": "managed-node1", "ansible_domain": "", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "622812727ab94fd6acd7dd0d437b6e90", "ansible_loadavg": {"1m": 0.42, "5m": 0.44, "15m": 0.27}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-511.el9.x86_64", "root": "UUID=ad406aa3-aab4-4a6a-aa73-3e870a6316ae", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-511.el9.x86_64", "root": "UUID=ad406aa3-aab4-4a6a-aa73-3e870a6316ae", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", 
"console": ["tty0", "ttyS0,115200n8"]}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Tuesday", "weekday_number": "2", "weeknumber": "39", "day": "24", "hour": "15", "minute": "00", "second": "44", "epoch": "1727204444", "epoch_int": "1727204444", "date": "2024-09-24", "time": "15:00:44", "iso8601_micro": "2024-09-24T19:00:44.798987Z", "iso8601": "2024-09-24T19:00:44Z", "iso8601_basic": "20240924T150044798987", "iso8601_basic_short": "20240924T150044", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_apparmor": {"status": "disabled"}, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:d5aef1ea-3141-48ae-bf33-0c6b351dd422", "ansible_iscsi_iqn": "", "ansible_lsb": {}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3532, "ansible_memfree_mb": 2790, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3532, "used": 742, "free": 2790}, "nocache": {"free": 3265, "used": 267}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec28c5e6-50d6-5684-e735-f75357a23b08", "ansible_product_uuid": "ec28c5e6-50d6-5684-e735-f75357a23b08", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": 
{"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda1": {"links": {"ids": [], "uuids": ["ad406aa3-aab4-4a6a-aa73-3e870a6316ae"], "labels": [], "masters": []}, "start": "2048", "sectors": "524285919", "sectorsize": 512, "size": "250.00 GB", "uuid": "ad406aa3-aab4-4a6a-aa73-3e870a6316ae", "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda1": ["ad406aa3-aab4-4a6a-aa73-3e870a6316ae"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 707, "ansible_lvm": "N/A", "ansible_mounts": [{"mount": "/", "device": "/dev/xvda1", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268367278080, "size_available": 264271777792, "block_size": 4096, "block_total": 65519355, "block_available": 64519477, "block_used": 999878, "inode_total": 131071472, "inode_available": 130998230, "inode_used": 73242, "uuid": "ad406aa3-aab4-4a6a-aa73-3e870a6316ae"}], "ansible_fips": false, "ansible_local": {}, "ansible_interfaces": ["eth0", "rpltstbr", "lo"], "ansible_eth0": {"device": "eth0", "macaddress": "12:8f:92:e7:c1:ab", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.9.148", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22"}, "ipv6": [{"address": "fe80::108f:92ff:fee7:c1ab", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off 
[fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", 
"hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_rpltstbr": {"device": "rpltstbr", "macaddress": "4a:d1:a2:43:cd:1d", "mtu": 1500, "active": false, "type": "bridge", "interfaces": [], "id": "8000.000000000000", "stp": false, "speed": -1, "promisc": false, "ipv4": {"address": "192.0.2.72", "broadcast": "", "netmask": "255.255.255.254", "network": "192.0.2.72", "prefix": "31"}, "features": {"rx_checksumming": "off [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "on", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "on", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "on", "tx_fcoe_segmentation": "on", "tx_gre_segmentation": "on", "tx_gre_csum_segmentation": "on", "tx_ipxip4_segmentation": "on", "tx_ipxip6_segmentation": "on", "tx_udp_tnl_segmentation": "on", "tx_udp_tnl_csum_segmentation": "on", "tx_gso_partial": "on", "tx_tunnel_remcsum_segmentation": "on", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "on", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "on", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", 
"hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", 
"tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.8.1", "interface": "eth0", "address": "10.31.9.148", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22", "macaddress": "12:8f:92:e7:c1:ab", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.9.148", "192.0.2.72"], "ansible_all_ipv6_addresses": ["fe80::108f:92ff:fee7:c1ab"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.9.148", "127.0.0.0/8", "127.0.0.1", "192.0.2.72"], "ipv6": ["::1", "fe80::108f:92ff:fee7:c1ab"]}, "ansible_env": {"PYTHONVERBOSE": "1", "SHELL": "/bin/bash", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.9", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.14.85 42862 10.31.9.148 22", 
"XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.14.85 42862 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "which_declare": "declare -f", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0", "BASH_FUNC_which%%": "() { ( alias;\n eval ${which_declare} ) | /usr/bin/which --tty-only --read-alias --read-functions --show-tilde --show-dot $@\n}"}, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_pkg_mgr": "dnf", "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_service_mgr": "systemd", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.path_hooks # clear sys.path_importer_cache # clear sys.meta_path # clear sys.__interactivehook__ # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # 
cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing encodings.latin_1 # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing _locale # cleanup[2] removing _bootlocale # destroy _bootlocale # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing sre_constants # destroy sre_constants # cleanup[2] removing sre_parse # cleanup[2] removing sre_compile # cleanup[2] removing _heapq # cleanup[2] removing heapq # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing collections.abc # cleanup[2] removing contextlib # cleanup[2] removing typing # destroy typing # cleanup[2] removing importlib.abc # cleanup[2] removing importlib.util # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] 
removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing threading # cleanup[2] removing _bz2 # destroy _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing pwd # cleanup[2] removing grp # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing tempfile # cleanup[2] removing zipfile # destroy zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing fcntl # cleanup[2] removing locale # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing subprocess # cleanup[2] removing platform # cleanup[2] removing shlex # cleanup[2] removing token # destroy token # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing 
systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy 
ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing 
ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing 
ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing 
ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # 
destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing gc # cleanup[2] removing 
unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna # cleanup[2] removing multiprocessing.queues # cleanup[2] removing multiprocessing.synchronize # cleanup[2] removing multiprocessing.dummy.connection # cleanup[2] removing multiprocessing.dummy # destroy _sitebuiltins # destroy importlib.util # destroy importlib.abc # destroy importlib.machinery # destroy zipimport # destroy _compression # destroy binascii # destroy importlib # destroy bz2 # destroy lzma # destroy __main__ # destroy locale # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy encodings # destroy syslog # destroy uuid # destroy selinux # destroy distro # destroy logging # destroy argparse # destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.queues # destroy multiprocessing.synchronize # destroy multiprocessing.dummy # destroy multiprocessing.pool # destroy pickle # destroy _compat_pickle # destroy queue # destroy multiprocessing.reduction # destroy shlex # destroy datetime # destroy base64 # destroy ansible.module_utils.compat.selinux # destroy getpass # destroy json # destroy socket # destroy struct # destroy glob # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # destroy multiprocessing.connection # destroy tempfile # destroy multiprocessing.context # destroy multiprocessing.process # destroy multiprocessing.util # destroy array # destroy multiprocessing.dummy.connection # cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping unicodedata # cleanup[3] wiping gc # cleanup[3] wiping termios # cleanup[3] wiping _ssl # cleanup[3] wiping configparser # cleanup[3] wiping _multiprocessing # cleanup[3] wiping _queue # cleanup[3] wiping _pickle # 
cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # cleanup[3] wiping tokenize # cleanup[3] wiping platform # destroy subprocess # cleanup[3] wiping selectors # cleanup[3] wiping select # cleanup[3] wiping _posixsubprocess # cleanup[3] wiping signal # cleanup[3] wiping fcntl # cleanup[3] wiping atexit # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping _blake2 # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping shutil # destroy fnmatch # cleanup[3] wiping grp # cleanup[3] wiping pwd # cleanup[3] wiping _lzma # cleanup[3] wiping threading # cleanup[3] wiping zlib # cleanup[3] wiping errno # cleanup[3] wiping weakref # cleanup[3] wiping contextlib # cleanup[3] wiping collections.abc # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy enum # destroy sre_compile # destroy copyreg # cleanup[3] wiping functools # cleanup[3] wiping _functools # destroy _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy heapq # destroy collections.abc # cleanup[3] wiping _collections # destroy _collections # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping itertools # cleanup[3] wiping _heapq # cleanup[3] wiping sre_parse # cleanup[3] wiping _sre # cleanup[3] wiping types # cleanup[3] wiping _locale # destroy _locale # cleanup[3] wiping os # 
cleanup[3] wiping os.path # destroy genericpath # cleanup[3] wiping posixpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.latin_1 # cleanup[3] wiping _signal # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy unicodedata # destroy gc # destroy termios # destroy _ssl # destroy _multiprocessing # destroy _queue # destroy _pickle # destroy systemd._daemon # destroy _socket # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy fcntl # destroy _blake2 # destroy _lzma # destroy zlib # destroy _signal # destroy platform # destroy _uuid # destroy _sre # destroy sre_parse # destroy tokenize # destroy _heapq # destroy posixpath # destroy stat # destroy ansible.module_utils.six.moves.urllib # destroy errno # destroy signal # destroy contextlib # destroy pwd # destroy grp # destroy _posixsubprocess # destroy selectors # destroy select # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy itertools # destroy operator # destroy ansible.module_utils.six.moves # destroy _operator # destroy _frozen_importlib_external # destroy _imp # destroy io # destroy marshal # destroy _frozen_importlib # clear sys.audit hooks , 
stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.148 closed. 
[WARNING]: Module invocation had junk after the JSON data: # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.path_hooks # clear sys.path_importer_cache # clear sys.meta_path # clear sys.__interactivehook__ # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing encodings.latin_1 # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing _locale # cleanup[2] removing _bootlocale # destroy _bootlocale # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing sre_constants # destroy sre_constants # cleanup[2] removing sre_parse # cleanup[2] removing sre_compile # cleanup[2] removing _heapq # cleanup[2] removing heapq # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] 
removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing collections.abc # cleanup[2] removing contextlib # cleanup[2] removing typing # destroy typing # cleanup[2] removing importlib.abc # cleanup[2] removing importlib.util # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing threading # cleanup[2] removing _bz2 # destroy _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing pwd # cleanup[2] removing grp # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing tempfile # cleanup[2] removing zipfile # destroy zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing fcntl 
# cleanup[2] removing locale # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing subprocess # cleanup[2] removing platform # cleanup[2] removing shlex # cleanup[2] removing token # destroy token # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy 
ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] 
removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # 
cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing 
ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy 
ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy 
ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing gc # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna # cleanup[2] removing multiprocessing.queues # cleanup[2] removing multiprocessing.synchronize # cleanup[2] removing multiprocessing.dummy.connection # cleanup[2] removing multiprocessing.dummy # destroy _sitebuiltins # destroy importlib.util # destroy importlib.abc # destroy importlib.machinery # destroy zipimport # destroy _compression # destroy binascii # destroy importlib # destroy bz2 # destroy lzma # destroy __main__ # destroy locale # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy encodings # destroy syslog # destroy uuid # destroy selinux # destroy distro # destroy logging # destroy argparse # destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.queues # destroy multiprocessing.synchronize # destroy multiprocessing.dummy # destroy multiprocessing.pool # destroy pickle # destroy _compat_pickle # destroy queue # destroy multiprocessing.reduction # destroy shlex # destroy datetime # destroy base64 # destroy 
ansible.module_utils.compat.selinux # destroy getpass # destroy json # destroy socket # destroy struct # destroy glob # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # destroy multiprocessing.connection # destroy tempfile # destroy multiprocessing.context # destroy multiprocessing.process # destroy multiprocessing.util # destroy array # destroy multiprocessing.dummy.connection # cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping unicodedata # cleanup[3] wiping gc # cleanup[3] wiping termios # cleanup[3] wiping _ssl # cleanup[3] wiping configparser # cleanup[3] wiping _multiprocessing # cleanup[3] wiping _queue # cleanup[3] wiping _pickle # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # cleanup[3] wiping tokenize # cleanup[3] wiping platform # destroy subprocess # cleanup[3] wiping selectors # cleanup[3] wiping select # cleanup[3] wiping _posixsubprocess # cleanup[3] wiping signal # cleanup[3] wiping fcntl # cleanup[3] wiping atexit # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping _blake2 # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping shutil # destroy fnmatch # cleanup[3] wiping grp # cleanup[3] wiping pwd # cleanup[3] wiping _lzma # cleanup[3] wiping threading # cleanup[3] wiping zlib # cleanup[3] wiping errno # cleanup[3] wiping weakref # cleanup[3] wiping contextlib # 
cleanup[3] wiping collections.abc # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy enum # destroy sre_compile # destroy copyreg # cleanup[3] wiping functools # cleanup[3] wiping _functools # destroy _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy heapq # destroy collections.abc # cleanup[3] wiping _collections # destroy _collections # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping itertools # cleanup[3] wiping _heapq # cleanup[3] wiping sre_parse # cleanup[3] wiping _sre # cleanup[3] wiping types # cleanup[3] wiping _locale # destroy _locale # cleanup[3] wiping os # cleanup[3] wiping os.path # destroy genericpath # cleanup[3] wiping posixpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.latin_1 # cleanup[3] wiping _signal # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy unicodedata # destroy gc # destroy termios # destroy _ssl # destroy _multiprocessing # destroy _queue # destroy _pickle # destroy systemd._daemon # destroy _socket # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy fcntl # destroy _blake2 # destroy _lzma # destroy zlib # destroy _signal # destroy platform # destroy _uuid # destroy _sre # destroy sre_parse # destroy tokenize # destroy _heapq # destroy 
posixpath # destroy stat # destroy ansible.module_utils.six.moves.urllib # destroy errno # destroy signal # destroy contextlib # destroy pwd # destroy grp # destroy _posixsubprocess # destroy selectors # destroy select # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy itertools # destroy operator # destroy ansible.module_utils.six.moves # destroy _operator # destroy _frozen_importlib_external # destroy _imp # destroy io # destroy marshal # destroy _frozen_importlib # clear sys.audit hooks [WARNING]: Platform linux on host managed-node1 is using the discovered Python interpreter at /usr/bin/python3.9, but future installation of another Python interpreter could change the meaning of that path. See https://docs.ansible.com/ansible-core/2.17/reference_appendices/interpreter_discovery.html for more information. 
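The interpreter-discovery warning above can be avoided by pinning the interpreter explicitly instead of relying on discovery. A minimal sketch, assuming a YAML inventory like the one loaded earlier in this run (the file name is illustrative; `ansible_python_interpreter` is the standard variable):

```yaml
# inventory.yml -- pin the interpreter so a future Python installation
# cannot change what interpreter discovery resolves to on the target
all:
  hosts:
    managed-node1:
      ansible_python_interpreter: /usr/bin/python3.9
```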
41684 1727204445.12316: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204443.6466987-41921-219615563569101/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 41684 1727204445.12319: _low_level_execute_command(): starting 41684 1727204445.12322: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204443.6466987-41921-219615563569101/ > /dev/null 2>&1 && sleep 0' 41684 1727204445.13492: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 41684 1727204445.13497: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41684 1727204445.13543: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204445.13547: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 41684 1727204445.13626: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41684 1727204445.13631: stderr chunk (state=3): >>>debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204445.13772: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 41684 1727204445.13841: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41684 1727204445.13845: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41684 1727204445.13944: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41684 1727204445.15771: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41684 1727204445.15819: stderr chunk (state=3): >>><<< 41684 1727204445.15822: stdout chunk (state=3): >>><<< 41684 1727204445.15842: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: 
mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41684 1727204445.15850: handler run complete 41684 1727204445.15993: variable 'ansible_facts' from source: unknown 41684 1727204445.16120: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41684 1727204445.16996: variable 'ansible_facts' from source: unknown 41684 1727204445.17087: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41684 1727204445.17206: attempt loop complete, returning result 41684 1727204445.17209: _execute() done 41684 1727204445.17212: dumping result to json 41684 1727204445.17245: done dumping result, returning 41684 1727204445.17252: done running TaskExecutor() for managed-node1/TASK: Gathering Facts [0affcd87-79f5-3839-086d-0000000000bf] 41684 1727204445.17258: sending task result for task 0affcd87-79f5-3839-086d-0000000000bf 41684 1727204445.17630: done sending task result for task 0affcd87-79f5-3839-086d-0000000000bf 41684 1727204445.17633: WORKER PROCESS EXITING ok: [managed-node1] 41684 1727204445.17967: no more pending results, returning what we have 41684 1727204445.17971: results queue empty 41684 1727204445.17972: checking for any_errors_fatal 41684 1727204445.17973: done checking for any_errors_fatal 41684 1727204445.17974: checking for max_fail_percentage 41684 1727204445.17976: done checking for max_fail_percentage 41684 1727204445.17976: checking to see if all hosts have failed and the running result is not ok 41684 1727204445.17977: done checking to see if all hosts have failed 41684 1727204445.17978: getting the remaining hosts for this loop 41684 1727204445.17980: done getting the remaining hosts for this loop 41684 1727204445.17985: getting the next task for host managed-node1 41684 1727204445.17992: done getting next task for host managed-node1 41684 1727204445.17994: ^ task is: TASK: meta (flush_handlers) 41684 
1727204445.17996: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 41684 1727204445.18000: getting variables 41684 1727204445.18002: in VariableManager get_vars() 41684 1727204445.18025: Calling all_inventory to load vars for managed-node1 41684 1727204445.18028: Calling groups_inventory to load vars for managed-node1 41684 1727204445.18031: Calling all_plugins_inventory to load vars for managed-node1 41684 1727204445.18043: Calling all_plugins_play to load vars for managed-node1 41684 1727204445.18046: Calling groups_plugins_inventory to load vars for managed-node1 41684 1727204445.18048: Calling groups_plugins_play to load vars for managed-node1 41684 1727204445.18230: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41684 1727204445.19048: done with get_vars() 41684 1727204445.19059: done getting variables 41684 1727204445.19126: in VariableManager get_vars() 41684 1727204445.19135: Calling all_inventory to load vars for managed-node1 41684 1727204445.19137: Calling groups_inventory to load vars for managed-node1 41684 1727204445.19140: Calling all_plugins_inventory to load vars for managed-node1 41684 1727204445.19144: Calling all_plugins_play to load vars for managed-node1 41684 1727204445.19146: Calling groups_plugins_inventory to load vars for managed-node1 41684 1727204445.19149: Calling groups_plugins_play to load vars for managed-node1 41684 1727204445.20108: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41684 1727204445.20312: done with get_vars() 41684 1727204445.20328: done queuing things up, now waiting for results queue to drain 41684 1727204445.20330: results queue empty 41684 
1727204445.20331: checking for any_errors_fatal 41684 1727204445.20334: done checking for any_errors_fatal 41684 1727204445.20335: checking for max_fail_percentage 41684 1727204445.20341: done checking for max_fail_percentage 41684 1727204445.20341: checking to see if all hosts have failed and the running result is not ok 41684 1727204445.20342: done checking to see if all hosts have failed 41684 1727204445.20343: getting the remaining hosts for this loop 41684 1727204445.20344: done getting the remaining hosts for this loop 41684 1727204445.20347: getting the next task for host managed-node1 41684 1727204445.20351: done getting next task for host managed-node1 41684 1727204445.20354: ^ task is: TASK: Include the task 'el_repo_setup.yml' 41684 1727204445.20355: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 41684 1727204445.20357: getting variables 41684 1727204445.20358: in VariableManager get_vars() 41684 1727204445.20371: Calling all_inventory to load vars for managed-node1 41684 1727204445.20373: Calling groups_inventory to load vars for managed-node1 41684 1727204445.20376: Calling all_plugins_inventory to load vars for managed-node1 41684 1727204445.20380: Calling all_plugins_play to load vars for managed-node1 41684 1727204445.20383: Calling groups_plugins_inventory to load vars for managed-node1 41684 1727204445.20385: Calling groups_plugins_play to load vars for managed-node1 41684 1727204445.21469: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41684 1727204445.21681: done with get_vars() 41684 1727204445.21690: done getting variables TASK [Include the task 'el_repo_setup.yml'] ************************************ task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/tests_route_device_nm.yml:11 Tuesday 24 September 2024 15:00:45 -0400 (0:00:01.610) 0:00:01.627 ***** 41684 1727204445.22576: entering _queue_task() for managed-node1/include_tasks 41684 1727204445.22579: Creating lock for include_tasks 41684 1727204445.22891: worker is 1 (out of 1 available) 41684 1727204445.22902: exiting _queue_task() for managed-node1/include_tasks 41684 1727204445.22914: done queuing things up, now waiting for results queue to drain 41684 1727204445.22915: waiting for pending results... 
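Every record in this trace follows the shape `<pid> <epoch-timestamp>: <message>` (pid 41684 here), with raw SSH `debug1:`/`debug2:` chunks interleaved that do not match that shape. For anyone post-processing a capture like this one, a minimal parser sketch (the function name and regex are mine, not part of Ansible):

```python
import re

# One verbose-log record: "<pid> <unix-timestamp>: <message>"
RECORD_RE = re.compile(r"^(\d+)\s+(\d+\.\d+):\s+(.*)$")

def parse_record(line):
    """Split an ansible -vvvv record into (pid, timestamp, message).

    Returns None for lines that are not records, e.g. the raw
    stdout/stderr chunks echoed by the ssh connection plugin.
    """
    m = RECORD_RE.match(line)
    if m is None:
        return None
    pid, ts, msg = m.groups()
    return int(pid), float(ts), msg

record = parse_record(
    "41684 1727204445.22576: entering _queue_task() for managed-node1/include_tasks"
)
```

Pairing the parsed timestamps of consecutive records gives per-step durations, which is how the `(0:00:01.610)` deltas in the `TASK [...]` banners above can be cross-checked.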
41684 1727204445.23384: running TaskExecutor() for managed-node1/TASK: Include the task 'el_repo_setup.yml' 41684 1727204445.24161: in run() - task 0affcd87-79f5-3839-086d-000000000006 41684 1727204445.24185: variable 'ansible_search_path' from source: unknown 41684 1727204445.24225: calling self._execute() 41684 1727204445.24300: variable 'ansible_host' from source: host vars for 'managed-node1' 41684 1727204445.24311: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41684 1727204445.24325: variable 'omit' from source: magic vars 41684 1727204445.24435: _execute() done 41684 1727204445.24444: dumping result to json 41684 1727204445.24451: done dumping result, returning 41684 1727204445.24461: done running TaskExecutor() for managed-node1/TASK: Include the task 'el_repo_setup.yml' [0affcd87-79f5-3839-086d-000000000006] 41684 1727204445.24474: sending task result for task 0affcd87-79f5-3839-086d-000000000006 41684 1727204445.24615: no more pending results, returning what we have 41684 1727204445.24620: in VariableManager get_vars() 41684 1727204445.24651: Calling all_inventory to load vars for managed-node1 41684 1727204445.24654: Calling groups_inventory to load vars for managed-node1 41684 1727204445.24657: Calling all_plugins_inventory to load vars for managed-node1 41684 1727204445.24676: Calling all_plugins_play to load vars for managed-node1 41684 1727204445.24678: Calling groups_plugins_inventory to load vars for managed-node1 41684 1727204445.24682: Calling groups_plugins_play to load vars for managed-node1 41684 1727204445.24845: done sending task result for task 0affcd87-79f5-3839-086d-000000000006 41684 1727204445.24848: WORKER PROCESS EXITING 41684 1727204445.24876: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41684 1727204445.25070: done with get_vars() 41684 1727204445.25078: variable 'ansible_search_path' from source: unknown 41684 1727204445.25092: we have 
included files to process 41684 1727204445.25093: generating all_blocks data 41684 1727204445.25094: done generating all_blocks data 41684 1727204445.25095: processing included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml 41684 1727204445.25096: loading included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml 41684 1727204445.25099: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml 41684 1727204445.27096: in VariableManager get_vars() 41684 1727204445.27112: done with get_vars() 41684 1727204445.27124: done processing included file 41684 1727204445.27126: iterating over new_blocks loaded from include file 41684 1727204445.27127: in VariableManager get_vars() 41684 1727204445.27137: done with get_vars() 41684 1727204445.27138: filtering new block on tags 41684 1727204445.27152: done filtering new block on tags 41684 1727204445.27155: in VariableManager get_vars() 41684 1727204445.27571: done with get_vars() 41684 1727204445.27574: filtering new block on tags 41684 1727204445.27591: done filtering new block on tags 41684 1727204445.27593: in VariableManager get_vars() 41684 1727204445.27604: done with get_vars() 41684 1727204445.27605: filtering new block on tags 41684 1727204445.27617: done filtering new block on tags 41684 1727204445.27619: done iterating over new_blocks loaded from include file included: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml for managed-node1 41684 1727204445.27626: extending task lists for all hosts with included blocks 41684 1727204445.27681: done extending task lists 41684 1727204445.27682: done processing included files 41684 1727204445.27683: results queue empty 41684 1727204445.27684: checking for any_errors_fatal 41684 1727204445.27686: done checking for any_errors_fatal 41684 
1727204445.27686: checking for max_fail_percentage 41684 1727204445.27688: done checking for max_fail_percentage 41684 1727204445.27688: checking to see if all hosts have failed and the running result is not ok 41684 1727204445.27689: done checking to see if all hosts have failed 41684 1727204445.27690: getting the remaining hosts for this loop 41684 1727204445.27691: done getting the remaining hosts for this loop 41684 1727204445.27694: getting the next task for host managed-node1 41684 1727204445.27698: done getting next task for host managed-node1 41684 1727204445.27700: ^ task is: TASK: Gather the minimum subset of ansible_facts required by the network role test 41684 1727204445.27702: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 41684 1727204445.27704: getting variables 41684 1727204445.27705: in VariableManager get_vars() 41684 1727204445.27713: Calling all_inventory to load vars for managed-node1 41684 1727204445.27715: Calling groups_inventory to load vars for managed-node1 41684 1727204445.27717: Calling all_plugins_inventory to load vars for managed-node1 41684 1727204445.27723: Calling all_plugins_play to load vars for managed-node1 41684 1727204445.27725: Calling groups_plugins_inventory to load vars for managed-node1 41684 1727204445.27729: Calling groups_plugins_play to load vars for managed-node1 41684 1727204445.27903: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41684 1727204445.28928: done with get_vars() 41684 1727204445.28938: done getting variables TASK [Gather the minimum subset of ansible_facts required by the network role test] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:3 Tuesday 24 September 2024 15:00:45 -0400 (0:00:00.064) 0:00:01.691 ***** 41684 1727204445.29012: entering _queue_task() for managed-node1/setup 41684 1727204445.30117: worker is 1 (out of 1 available) 41684 1727204445.30129: exiting _queue_task() for managed-node1/setup 41684 1727204445.30141: done queuing things up, now waiting for results queue to drain 41684 1727204445.30143: waiting for pending results... 
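The setup task queued here gathers only the minimum fact subset the network role test needs; later in the trace the executor logs `Evaluated conditional (not ansible_facts.keys() | list | intersect(network_test_required_facts) == network_test_required_facts): True`. In plain Python, that Jinja2 expression reduces to a subset check on the already-gathered fact keys — a sketch with illustrative values, not data from this run:

```python
def needs_fact_gathering(ansible_facts, required_facts):
    """True when at least one required fact key is still missing.

    Mirrors the Jinja2 test `not ansible_facts.keys() | list |
    intersect(required) == required`: intersect the gathered keys
    with the required list, then compare against the full list.
    """
    gathered = [f for f in required_facts if f in ansible_facts]
    return gathered != list(required_facts)

# Before any gathering no facts are present, so the conditional is True
# and the setup module is dispatched to the managed node.
result = needs_fact_gathering({}, ["distribution", "os_family"])
```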
41684 1727204445.30716: running TaskExecutor() for managed-node1/TASK: Gather the minimum subset of ansible_facts required by the network role test 41684 1727204445.30973: in run() - task 0affcd87-79f5-3839-086d-0000000000d0 41684 1727204445.31087: variable 'ansible_search_path' from source: unknown 41684 1727204445.31277: variable 'ansible_search_path' from source: unknown 41684 1727204445.31320: calling self._execute() 41684 1727204445.31395: variable 'ansible_host' from source: host vars for 'managed-node1' 41684 1727204445.31405: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41684 1727204445.31418: variable 'omit' from source: magic vars 41684 1727204445.32420: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 41684 1727204445.37995: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 41684 1727204445.38136: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 41684 1727204445.38308: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 41684 1727204445.38346: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 41684 1727204445.38380: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 41684 1727204445.38461: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 41684 1727204445.38699: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 41684 1727204445.38732: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 41684 1727204445.38782: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 41684 1727204445.38803: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 41684 1727204445.39126: variable 'ansible_facts' from source: unknown 41684 1727204445.39245: variable 'network_test_required_facts' from source: task vars 41684 1727204445.39428: Evaluated conditional (not ansible_facts.keys() | list | intersect(network_test_required_facts) == network_test_required_facts): True 41684 1727204445.39441: variable 'omit' from source: magic vars 41684 1727204445.39484: variable 'omit' from source: magic vars 41684 1727204445.39608: variable 'omit' from source: magic vars 41684 1727204445.39639: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 41684 1727204445.39703: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 41684 1727204445.39859: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 41684 1727204445.39885: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41684 1727204445.39899: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41684 1727204445.39936: variable 'inventory_hostname' from source: host vars for 'managed-node1' 41684 1727204445.39945: variable 'ansible_host' from source: host vars for 
'managed-node1' 41684 1727204445.40049: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41684 1727204445.40260: Set connection var ansible_connection to ssh 41684 1727204445.40277: Set connection var ansible_pipelining to False 41684 1727204445.40289: Set connection var ansible_module_compression to ZIP_DEFLATED 41684 1727204445.40300: Set connection var ansible_timeout to 10 41684 1727204445.40314: Set connection var ansible_shell_executable to /bin/sh 41684 1727204445.40322: Set connection var ansible_shell_type to sh 41684 1727204445.40355: variable 'ansible_shell_executable' from source: unknown 41684 1727204445.40368: variable 'ansible_connection' from source: unknown 41684 1727204445.40379: variable 'ansible_module_compression' from source: unknown 41684 1727204445.40387: variable 'ansible_shell_type' from source: unknown 41684 1727204445.40395: variable 'ansible_shell_executable' from source: unknown 41684 1727204445.40403: variable 'ansible_host' from source: host vars for 'managed-node1' 41684 1727204445.40411: variable 'ansible_pipelining' from source: unknown 41684 1727204445.40476: variable 'ansible_timeout' from source: unknown 41684 1727204445.40488: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41684 1727204445.40833: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 41684 1727204445.40849: variable 'omit' from source: magic vars 41684 1727204445.40858: starting attempt loop 41684 1727204445.40865: running the handler 41684 1727204445.40882: _low_level_execute_command(): starting 41684 1727204445.40892: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 41684 1727204445.42721: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 41684 
1727204445.42738: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41684 1727204445.42752: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41684 1727204445.42778: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41684 1727204445.42911: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 41684 1727204445.42923: stderr chunk (state=3): >>>debug2: match not found <<< 41684 1727204445.42938: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204445.42957: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 41684 1727204445.42972: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 41684 1727204445.42988: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 41684 1727204445.43000: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41684 1727204445.43014: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41684 1727204445.43029: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41684 1727204445.43041: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 41684 1727204445.43052: stderr chunk (state=3): >>>debug2: match found <<< 41684 1727204445.43068: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204445.43147: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 41684 1727204445.43279: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41684 1727204445.43297: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41684 1727204445.43395: 
stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41684 1727204445.44976: stdout chunk (state=3): >>>/root <<< 41684 1727204445.45171: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41684 1727204445.45175: stdout chunk (state=3): >>><<< 41684 1727204445.45177: stderr chunk (state=3): >>><<< 41684 1727204445.45292: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41684 1727204445.45295: _low_level_execute_command(): starting 41684 1727204445.45298: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204445.451992-42145-243691883381375 `" && echo ansible-tmp-1727204445.451992-42145-243691883381375="` echo /root/.ansible/tmp/ansible-tmp-1727204445.451992-42145-243691883381375 
`" ) && sleep 0' 41684 1727204445.46851: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 41684 1727204445.46869: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41684 1727204445.46886: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41684 1727204445.46905: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41684 1727204445.46952: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 41684 1727204445.46966: stderr chunk (state=3): >>>debug2: match not found <<< 41684 1727204445.46982: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204445.46999: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 41684 1727204445.47011: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 41684 1727204445.47026: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 41684 1727204445.47039: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41684 1727204445.47052: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41684 1727204445.47083: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41684 1727204445.47095: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 41684 1727204445.47106: stderr chunk (state=3): >>>debug2: match found <<< 41684 1727204445.47120: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204445.47205: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 41684 1727204445.47372: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41684 
1727204445.47390: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41684 1727204445.47482: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41684 1727204445.49356: stdout chunk (state=3): >>>ansible-tmp-1727204445.451992-42145-243691883381375=/root/.ansible/tmp/ansible-tmp-1727204445.451992-42145-243691883381375 <<< 41684 1727204445.49480: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41684 1727204445.49573: stderr chunk (state=3): >>><<< 41684 1727204445.49578: stdout chunk (state=3): >>><<< 41684 1727204445.49670: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204445.451992-42145-243691883381375=/root/.ansible/tmp/ansible-tmp-1727204445.451992-42145-243691883381375 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41684 1727204445.49673: variable 'ansible_module_compression' 
from source: unknown 41684 1727204445.49872: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-41684fyviudxd/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 41684 1727204445.49875: variable 'ansible_facts' from source: unknown 41684 1727204445.49902: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204445.451992-42145-243691883381375/AnsiballZ_setup.py 41684 1727204445.50515: Sending initial data 41684 1727204445.50525: Sent initial data (153 bytes) 41684 1727204445.53104: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 41684 1727204445.53109: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41684 1727204445.53213: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204445.53216: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 41684 1727204445.53269: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41684 1727204445.53272: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204445.53453: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 41684 1727204445.53456: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41684 1727204445.53458: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: 
master version 4 <<< 41684 1727204445.53527: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41684 1727204445.55230: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 41684 1727204445.55290: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 41684 1727204445.55347: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-41684fyviudxd/tmprxuwmqu_ /root/.ansible/tmp/ansible-tmp-1727204445.451992-42145-243691883381375/AnsiballZ_setup.py <<< 41684 1727204445.55396: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 41684 1727204445.58690: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41684 1727204445.58796: stderr chunk (state=3): >>><<< 41684 1727204445.58801: stdout chunk (state=3): >>><<< 41684 1727204445.58824: done transferring module to remote 41684 1727204445.58838: _low_level_execute_command(): starting 41684 1727204445.58844: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204445.451992-42145-243691883381375/ /root/.ansible/tmp/ansible-tmp-1727204445.451992-42145-243691883381375/AnsiballZ_setup.py && sleep 0' 41684 1727204445.60395: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 41684 1727204445.60445: stderr chunk 
(state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41684 1727204445.60541: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41684 1727204445.60562: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41684 1727204445.60608: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 41684 1727204445.60620: stderr chunk (state=3): >>>debug2: match not found <<< 41684 1727204445.60636: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204445.60658: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 41684 1727204445.60673: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 41684 1727204445.60685: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 41684 1727204445.60697: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41684 1727204445.60711: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41684 1727204445.60727: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41684 1727204445.60741: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 41684 1727204445.60759: stderr chunk (state=3): >>>debug2: match found <<< 41684 1727204445.60777: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204445.60930: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 41684 1727204445.60947: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41684 1727204445.60965: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41684 1727204445.61195: stderr chunk (state=3): 
>>>debug1: mux_client_request_session: master session id: 2 <<<
41684 1727204445.62966: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
41684 1727204445.62971: stdout chunk (state=3): >>><<<
41684 1727204445.62974: stderr chunk (state=3): >>><<<
41684 1727204445.63076: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024
debug1: Reading configuration data /root/.ssh/config
debug1: Reading configuration data /etc/ssh/ssh_config
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf
debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148
debug2: match not found
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config
debug1: configuration requests final Match pass
debug2: resolve_canonicalize: hostname 10.31.9.148 is address
debug1: re-parsing configuration
debug1: Reading configuration data /root/.ssh/config
debug1: Reading configuration data /etc/ssh/ssh_config
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf
debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148
debug2: match found
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config
debug1: auto-mux: Trying existing master
debug2: fd 3 setting O_NONBLOCK
debug2: mux_client_hello_exchange: master version 4
debug1: mux_client_request_session: master session id: 2
debug2: Received exit status from master 0
41684 1727204445.63080: _low_level_execute_command(): starting
41684 1727204445.63083: _low_level_execute_command(): executing: /bin/sh -c 'PYTHONVERBOSE=1 /usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204445.451992-42145-243691883381375/AnsiballZ_setup.py && sleep 0'
41684 1727204445.64517: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<<
41684 1727204445.64531: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<<
41684 1727204445.64544: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<<
41684 1727204445.64560: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<<
41684 1727204445.64622: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<<
41684 1727204445.64680: stderr chunk (state=3): >>>debug2: match not found <<<
41684 1727204445.64698: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
41684 1727204445.64723: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<<
41684 1727204445.64736: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<<
41684 1727204445.64748: stderr chunk (state=3): >>>debug1: re-parsing configuration <<<
41684 1727204445.64760: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<<
41684 1727204445.64778: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<<
41684 1727204445.64794: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<<
41684 1727204445.64809: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<<
41684 1727204445.64824: stderr chunk (state=3): >>>debug2: match found <<<
41684 1727204445.64837: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
41684 1727204445.64970: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<<
41684 1727204445.65052: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<<
41684 1727204445.65073: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
41684 1727204445.65174: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
41684 1727204445.67115: stdout chunk (state=3): >>>import _frozen_importlib # frozen import _imp # builtin import '_thread' # import '_warnings' # import '_weakref' # <<<
41684 1727204445.67174: stdout chunk (state=3): >>>import '_io' # import 'marshal' # <<<
41684 1727204445.67220: stdout chunk (state=3): >>>import 'posix' # <<<
41684 1727204445.67242: stdout chunk (state=3): >>>import '_frozen_importlib_external' # # installing zipimport hook <<<
41684 1727204445.67286: stdout chunk (state=3): >>>import 'time' # <<<
41684 1727204445.67289: stdout chunk (state=3): >>>import 'zipimport' # # installed zipimport hook <<<
41684 1727204445.67340: stdout chunk (state=3): >>># /usr/lib64/python3.9/encodings/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/encodings/__init__.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/__init__.cpython-39.pyc' <<<
41684 1727204445.67385: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/codecs.cpython-39.pyc matches /usr/lib64/python3.9/codecs.py # code object from '/usr/lib64/python3.9/__pycache__/codecs.cpython-39.pyc' <<<
41684 1727204445.67389: stdout chunk (state=3): >>>import '_codecs' # <<<
41684 1727204445.67400: stdout chunk (state=3): >>>import 'codecs' # <_frozen_importlib_external.SourceFileLoader object at 0x7f27c151edc0> <<<
41684 1727204445.67453: stdout chunk (state=3): >>># /usr/lib64/python3.9/encodings/__pycache__/aliases.cpython-39.pyc matches /usr/lib64/python3.9/encodings/aliases.py <<<
41684 1727204445.67457: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/encodings/__pycache__/aliases.cpython-39.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7f27c14c33a0> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f27c151eb20> <<<
41684 1727204445.67498: stdout chunk (state=3): >>># /usr/lib64/python3.9/encodings/__pycache__/utf_8.cpython-39.pyc matches /usr/lib64/python3.9/encodings/utf_8.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/utf_8.cpython-39.pyc' <<<
41684 1727204445.67510: stdout chunk (state=3): >>>import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7f27c151eac0> <<<
41684 1727204445.67547: stdout chunk (state=3): >>>import '_signal' # <<<
41684 1727204445.67551: stdout chunk (state=3): >>># /usr/lib64/python3.9/encodings/__pycache__/latin_1.cpython-39.pyc matches /usr/lib64/python3.9/encodings/latin_1.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/latin_1.cpython-39.pyc' <<<
41684 1727204445.67575: stdout chunk (state=3): >>>import 'encodings.latin_1' # <_frozen_importlib_external.SourceFileLoader object at 0x7f27c14c3490> <<<
41684 1727204445.67615: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/io.cpython-39.pyc matches /usr/lib64/python3.9/io.py # code object from '/usr/lib64/python3.9/__pycache__/io.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/abc.py # code object from '/usr/lib64/python3.9/__pycache__/abc.cpython-39.pyc' <<<
41684 1727204445.67635: stdout chunk (state=3): >>>import '_abc' # import 'abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f27c14c3940> <<<
41684 1727204445.67648: stdout chunk (state=3): >>>import 'io' # <_frozen_importlib_external.SourceFileLoader object at 0x7f27c14c3670> <<<
41684 1727204445.67684: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/site.cpython-39.pyc matches /usr/lib64/python3.9/site.py # code object from '/usr/lib64/python3.9/__pycache__/site.cpython-39.pyc' <<<
41684 1727204445.67725: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/os.cpython-39.pyc matches /usr/lib64/python3.9/os.py <<<
41684 1727204445.67736: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/os.cpython-39.pyc' <<<
41684 1727204445.67761: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/stat.cpython-39.pyc matches /usr/lib64/python3.9/stat.py <<<
41684 1727204445.67776: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/stat.cpython-39.pyc' <<<
41684 1727204445.67805: stdout chunk (state=3): >>>import '_stat' # import 'stat' # <_frozen_importlib_external.SourceFileLoader object at 0x7f27c147a190> <<<
41684 1727204445.67819: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/_collections_abc.cpython-39.pyc matches /usr/lib64/python3.9/_collections_abc.py <<<
41684 1727204445.67830: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/_collections_abc.cpython-39.pyc' <<<
41684 1727204445.67893: stdout chunk (state=3): >>>import '_collections_abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f27c147a220> <<<
41684 1727204445.67922: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/posixpath.cpython-39.pyc matches /usr/lib64/python3.9/posixpath.py # code object from '/usr/lib64/python3.9/__pycache__/posixpath.cpython-39.pyc' <<<
41684 1727204445.67966: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/genericpath.cpython-39.pyc matches /usr/lib64/python3.9/genericpath.py # code object from '/usr/lib64/python3.9/__pycache__/genericpath.cpython-39.pyc' <<<
41684 1727204445.67977: stdout chunk (state=3): >>>import 'genericpath' # <_frozen_importlib_external.SourceFileLoader object at 0x7f27c149d850> import 'posixpath' # <_frozen_importlib_external.SourceFileLoader object at 0x7f27c147a940> <<<
41684 1727204445.68007: stdout chunk (state=3): >>>import 'os' # <_frozen_importlib_external.SourceFileLoader object at 0x7f27c14db880> <<<
41684 1727204445.68024: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/_sitebuiltins.cpython-39.pyc matches /usr/lib64/python3.9/_sitebuiltins.py # code object from '/usr/lib64/python3.9/__pycache__/_sitebuiltins.cpython-39.pyc' import '_sitebuiltins' # <_frozen_importlib_external.SourceFileLoader object at 0x7f27c1473d90> <<<
41684 1727204445.68086: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/_bootlocale.cpython-39.pyc matches /usr/lib64/python3.9/_bootlocale.py # code object from '/usr/lib64/python3.9/__pycache__/_bootlocale.cpython-39.pyc' <<<
41684 1727204445.68089: stdout chunk (state=3): >>>import '_locale' # import '_bootlocale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f27c149dd90> <<<
41684 1727204445.68140: stdout chunk (state=3): >>>import 'site' # <_frozen_importlib_external.SourceFileLoader object at 0x7f27c14c3970> <<<
41684 1727204445.68172: stdout chunk (state=3): >>>Python 3.9.19 (main, Aug 23 2024, 00:00:00) [GCC 11.5.0 20240719 (Red Hat 11.5.0-2)] on linux Type "help", "copyright", "credits" or "license" for more information. <<<
41684 1727204445.68510: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/base64.cpython-39.pyc matches /usr/lib64/python3.9/base64.py <<<
41684 1727204445.68545: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/base64.cpython-39.pyc' <<<
41684 1727204445.68569: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/re.cpython-39.pyc matches /usr/lib64/python3.9/re.py <<<
41684 1727204445.68603: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/re.cpython-39.pyc' <<<
41684 1727204445.68623: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/enum.cpython-39.pyc matches /usr/lib64/python3.9/enum.py # code object from '/usr/lib64/python3.9/__pycache__/enum.cpython-39.pyc' <<<
41684 1727204445.68637: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/types.cpython-39.pyc matches /usr/lib64/python3.9/types.py # code object from '/usr/lib64/python3.9/__pycache__/types.cpython-39.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7f27c11d4f10> <<<
41684 1727204445.68700: stdout chunk (state=3): >>>import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7f27c11d90a0> <<<
41684 1727204445.68729: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/sre_compile.cpython-39.pyc matches /usr/lib64/python3.9/sre_compile.py <<<
41684 1727204445.68742: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/sre_compile.cpython-39.pyc' import '_sre' # <<<
41684 1727204445.68758: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/sre_parse.cpython-39.pyc matches /usr/lib64/python3.9/sre_parse.py # code object from '/usr/lib64/python3.9/__pycache__/sre_parse.cpython-39.pyc' <<<
41684 1727204445.68796: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/sre_constants.cpython-39.pyc matches /usr/lib64/python3.9/sre_constants.py # code object from '/usr/lib64/python3.9/__pycache__/sre_constants.cpython-39.pyc' <<<
41684 1727204445.68824: stdout chunk (state=3): >>>import 'sre_constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7f27c11cc5b0> import 'sre_parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f27c11d36a0> <<<
41684 1727204445.68854: stdout chunk (state=3): >>>import 'sre_compile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f27c11d43d0> <<<
41684 1727204445.68857: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/functools.cpython-39.pyc matches /usr/lib64/python3.9/functools.py <<<
41684 1727204445.68930: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/functools.cpython-39.pyc' <<<
41684 1727204445.68947: stdout chunk (state=3): >>># /usr/lib64/python3.9/collections/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/collections/__init__.py <<<
41684 1727204445.68987: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/collections/__pycache__/__init__.cpython-39.pyc' <<<
41684 1727204445.68998: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/heapq.cpython-39.pyc matches /usr/lib64/python3.9/heapq.py # code object from '/usr/lib64/python3.9/__pycache__/heapq.cpython-39.pyc' <<<
41684 1727204445.69036: stdout chunk (state=3): >>># extension module '_heapq' loaded from '/usr/lib64/python3.9/lib-dynload/_heapq.cpython-39-x86_64-linux-gnu.so' <<<
41684 1727204445.69065: stdout chunk (state=3): >>># extension module '_heapq' executed from '/usr/lib64/python3.9/lib-dynload/_heapq.cpython-39-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f27c1090e80> import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7f27c1090970> import 'itertools' # <<<
41684 1727204445.69087: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/keyword.cpython-39.pyc matches /usr/lib64/python3.9/keyword.py # code object from '/usr/lib64/python3.9/__pycache__/keyword.cpython-39.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7f27c1090f70> <<<
41684 1727204445.69117: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/operator.cpython-39.pyc matches /usr/lib64/python3.9/operator.py # code object from '/usr/lib64/python3.9/__pycache__/operator.cpython-39.pyc' <<<
41684 1727204445.69146: stdout chunk (state=3): >>>import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7f27c1090dc0> <<<
41684 1727204445.69175: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/reprlib.cpython-39.pyc matches /usr/lib64/python3.9/reprlib.py # code object from '/usr/lib64/python3.9/__pycache__/reprlib.cpython-39.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f27c10a0130> import '_collections' # <<<
41684 1727204445.69223: stdout chunk (state=3): >>>import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7f27c11aedf0> import '_functools' # <<<
41684 1727204445.69264: stdout chunk (state=3): >>>import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7f27c11a76d0> <<<
41684 1727204445.69310: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/copyreg.cpython-39.pyc matches /usr/lib64/python3.9/copyreg.py # code object from '/usr/lib64/python3.9/__pycache__/copyreg.cpython-39.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7f27c11ba730> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7f27c11dae80> <<<
41684 1727204445.69336: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/struct.cpython-39.pyc matches /usr/lib64/python3.9/struct.py # code object from '/usr/lib64/python3.9/__pycache__/struct.cpython-39.pyc' <<<
41684 1727204445.69383: stdout chunk (state=3): >>># extension module '_struct' loaded from '/usr/lib64/python3.9/lib-dynload/_struct.cpython-39-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.9/lib-dynload/_struct.cpython-39-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f27c10a0d30> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7f27c11ae310> <<<
41684 1727204445.69414: stdout chunk (state=3): >>># extension module 'binascii' loaded from '/usr/lib64/python3.9/lib-dynload/binascii.cpython-39-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.9/lib-dynload/binascii.cpython-39-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f27c11ba340> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7f27c11e0a30> <<<
41684 1727204445.69440: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/runpy.cpython-39.pyc matches /usr/lib64/python3.9/runpy.py # code object from '/usr/lib64/python3.9/__pycache__/runpy.cpython-39.pyc' <<<
41684 1727204445.69477: stdout chunk (state=3): >>># /usr/lib64/python3.9/importlib/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/importlib/__init__.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/__init__.cpython-39.pyc' <<<
41684 1727204445.69513: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/warnings.cpython-39.pyc matches /usr/lib64/python3.9/warnings.py # code object from '/usr/lib64/python3.9/__pycache__/warnings.cpython-39.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f27c10a0f10> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f27c10a0e50> <<<
41684 1727204445.69554: stdout chunk (state=3): >>># /usr/lib64/python3.9/importlib/__pycache__/machinery.cpython-39.pyc matches /usr/lib64/python3.9/importlib/machinery.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/machinery.cpython-39.pyc' import 'importlib.machinery' # <_frozen_importlib_external.SourceFileLoader object at 0x7f27c10a0dc0> <<<
41684 1727204445.69581: stdout chunk (state=3): >>># /usr/lib64/python3.9/importlib/__pycache__/util.cpython-39.pyc matches /usr/lib64/python3.9/importlib/util.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/util.cpython-39.pyc' <<<
41684 1727204445.69614: stdout chunk (state=3): >>># /usr/lib64/python3.9/importlib/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/importlib/abc.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/abc.cpython-39.pyc' <<<
41684 1727204445.69626: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/typing.cpython-39.pyc matches /usr/lib64/python3.9/typing.py <<<
41684 1727204445.69678: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/typing.cpython-39.pyc' <<<
41684 1727204445.69712: stdout chunk (state=3): >>># /usr/lib64/python3.9/collections/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/collections/abc.py # code object from '/usr/lib64/python3.9/collections/__pycache__/abc.cpython-39.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f27c1074430> <<<
41684 1727204445.69738: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/contextlib.cpython-39.pyc matches /usr/lib64/python3.9/contextlib.py # code object from '/usr/lib64/python3.9/__pycache__/contextlib.cpython-39.pyc' <<<
41684 1727204445.69770: stdout chunk (state=3): >>>import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f27c1074520> <<<
41684 1727204445.69885: stdout chunk (state=3): >>>import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f27c10a8fa0> <<<
41684 1727204445.69939: stdout chunk (state=3): >>>import 'importlib.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f27c10a3af0> <<<
41684 1727204445.69960: stdout chunk (state=3): >>>import 'importlib.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7f27c10a34c0> # /usr/lib64/python3.9/__pycache__/pkgutil.cpython-39.pyc matches /usr/lib64/python3.9/pkgutil.py <<<
41684 1727204445.69977: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/pkgutil.cpython-39.pyc' <<<
41684 1727204445.70002: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/weakref.cpython-39.pyc matches /usr/lib64/python3.9/weakref.py <<<
41684 1727204445.70036: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/weakref.cpython-39.pyc' <<<
41684 1727204445.70052: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/_weakrefset.cpython-39.pyc matches /usr/lib64/python3.9/_weakrefset.py # code object from '/usr/lib64/python3.9/__pycache__/_weakrefset.cpython-39.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7f27c0fc2280> <<<
41684 1727204445.70082: stdout chunk (state=3): >>>import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7f27c105fdc0> <<<
41684 1727204445.70128: stdout chunk (state=3): >>>import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f27c10a3f70> import 'runpy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f27c11e00a0> <<<
41684 1727204445.70153: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/shutil.cpython-39.pyc matches /usr/lib64/python3.9/shutil.py <<<
41684 1727204445.70191: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/shutil.cpython-39.pyc' <<<
41684 1727204445.70203: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/fnmatch.cpython-39.pyc matches /usr/lib64/python3.9/fnmatch.py # code object from '/usr/lib64/python3.9/__pycache__/fnmatch.cpython-39.pyc' <<<
41684 1727204445.70242: stdout chunk (state=3): >>>import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7f27c0fd3bb0> import 'errno' # <<<
41684 1727204445.70275: stdout chunk (state=3): >>># extension module 'zlib' loaded from '/usr/lib64/python3.9/lib-dynload/zlib.cpython-39-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.9/lib-dynload/zlib.cpython-39-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f27c0fd3ee0> <<<
41684 1727204445.70294: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/bz2.cpython-39.pyc matches /usr/lib64/python3.9/bz2.py # code object from '/usr/lib64/python3.9/__pycache__/bz2.cpython-39.pyc' <<<
41684 1727204445.70310: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/_compression.cpython-39.pyc matches /usr/lib64/python3.9/_compression.py # code object from '/usr/lib64/python3.9/__pycache__/_compression.cpython-39.pyc' <<<
41684 1727204445.70325: stdout chunk (state=3): >>>import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7f27c0fe57f0> <<<
41684 1727204445.70341: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/threading.cpython-39.pyc matches /usr/lib64/python3.9/threading.py <<<
41684 1727204445.70370: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/threading.cpython-39.pyc' <<<
41684 1727204445.70399: stdout chunk (state=3): >>>import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7f27c0fe5d30> <<<
41684 1727204445.70430: stdout chunk (state=3): >>># extension module '_bz2' loaded from '/usr/lib64/python3.9/lib-dynload/_bz2.cpython-39-x86_64-linux-gnu.so' <<<
41684 1727204445.70439: stdout chunk (state=3): >>># extension module '_bz2' executed from '/usr/lib64/python3.9/lib-dynload/_bz2.cpython-39-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f27c0f7e460> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7f27c0fd3fd0> <<<
41684 1727204445.70463: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/lzma.cpython-39.pyc matches /usr/lib64/python3.9/lzma.py <<<
41684 1727204445.70476: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/lzma.cpython-39.pyc' <<<
41684 1727204445.70525: stdout chunk (state=3): >>># extension module '_lzma' loaded from '/usr/lib64/python3.9/lib-dynload/_lzma.cpython-39-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.9/lib-dynload/_lzma.cpython-39-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f27c0f8e340> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7f27c0fe5670> <<<
41684 1727204445.70530: stdout chunk (state=3): >>>import 'pwd' # <<<
41684 1727204445.70558: stdout chunk (state=3): >>># extension module 'grp' loaded from '/usr/lib64/python3.9/lib-dynload/grp.cpython-39-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.9/lib-dynload/grp.cpython-39-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f27c0f8e400> <<<
41684 1727204445.70617: stdout chunk (state=3): >>>import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f27c10a0a90> # /usr/lib64/python3.9/__pycache__/tempfile.cpython-39.pyc matches /usr/lib64/python3.9/tempfile.py <<<
41684 1727204445.70631: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/tempfile.cpython-39.pyc' <<<
41684 1727204445.70690: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/random.cpython-39.pyc matches /usr/lib64/python3.9/random.py <<<
41684 1727204445.70699: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/random.cpython-39.pyc' # extension module 'math' loaded from '/usr/lib64/python3.9/lib-dynload/math.cpython-39-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.9/lib-dynload/math.cpython-39-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f27c0faa760> <<<
41684 1727204445.70720: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/bisect.cpython-39.pyc matches /usr/lib64/python3.9/bisect.py # code object from '/usr/lib64/python3.9/__pycache__/bisect.cpython-39.pyc' <<<
41684 1727204445.70750: stdout chunk (state=3): >>># extension module '_bisect' loaded from '/usr/lib64/python3.9/lib-dynload/_bisect.cpython-39-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.9/lib-dynload/_bisect.cpython-39-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f27c0faaa30> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7f27c0faa820> <<<
41684 1727204445.70778: stdout chunk (state=3): >>># extension module '_random' loaded from '/usr/lib64/python3.9/lib-dynload/_random.cpython-39-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.9/lib-dynload/_random.cpython-39-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f27c0faa910> <<<
41684 1727204445.70807: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/hashlib.cpython-39.pyc matches /usr/lib64/python3.9/hashlib.py # code object from '/usr/lib64/python3.9/__pycache__/hashlib.cpython-39.pyc' <<<
41684 1727204445.70995: stdout chunk (state=3): >>># extension module '_hashlib' loaded from '/usr/lib64/python3.9/lib-dynload/_hashlib.cpython-39-x86_64-linux-gnu.so' # extension module '_hashlib' executed from '/usr/lib64/python3.9/lib-dynload/_hashlib.cpython-39-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f27c0faad60> <<<
41684 1727204445.71047: stdout chunk (state=3): >>># extension module '_blake2' loaded from '/usr/lib64/python3.9/lib-dynload/_blake2.cpython-39-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.9/lib-dynload/_blake2.cpython-39-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f27c0fb42b0> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f27c0faa9a0> <<<
41684 1727204445.71054: stdout chunk (state=3): >>>import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7f27c0f9eaf0> <<<
41684 1727204445.71076: stdout chunk (state=3): >>>import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f27c10a0670> <<<
41684 1727204445.71099: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/zipfile.cpython-39.pyc matches /usr/lib64/python3.9/zipfile.py <<<
41684 1727204445.71154: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/zipfile.cpython-39.pyc' <<<
41684 1727204445.71190: stdout chunk (state=3): >>>import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f27c0faab50> <<<
41684 1727204445.71324: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/encodings/cp437.pyc' <<<
41684 1727204445.71339: stdout chunk (state=3): >>>import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7f27c09e6730> <<<
41684 1727204445.71586: stdout chunk (state=3): >>># zipimport: found 103 names in '/tmp/ansible_setup_payload_jtlx47sb/ansible_setup_payload.zip' # zipimport: zlib available <<<
41684 1727204445.71680: stdout chunk (state=3): >>># zipimport: zlib available <<<
41684 1727204445.71726: stdout chunk (state=3): >>>import ansible # loaded from Zip /tmp/ansible_setup_payload_jtlx47sb/ansible_setup_payload.zip/ansible/__init__.py # zipimport: zlib available <<<
41684 1727204445.71748: stdout chunk (state=3): >>># zipimport: zlib available import ansible.module_utils # loaded from Zip /tmp/ansible_setup_payload_jtlx47sb/ansible_setup_payload.zip/ansible/module_utils/__init__.py <<<
41684 1727204445.71763: stdout chunk (state=3): >>># zipimport: zlib available <<<
41684 1727204445.72958: stdout chunk (state=3): >>># zipimport: zlib available <<<
41684 1727204445.73886: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/__future__.cpython-39.pyc matches /usr/lib64/python3.9/__future__.py # code object from '/usr/lib64/python3.9/__pycache__/__future__.cpython-39.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7f27c0923880> <<<
41684 1727204445.73920: stdout chunk (state=3): >>># /usr/lib64/python3.9/json/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/json/__init__.py # code object from '/usr/lib64/python3.9/json/__pycache__/__init__.cpython-39.pyc' <<<
41684 1727204445.73934: stdout chunk (state=3): >>># /usr/lib64/python3.9/json/__pycache__/decoder.cpython-39.pyc matches /usr/lib64/python3.9/json/decoder.py # code object from '/usr/lib64/python3.9/json/__pycache__/decoder.cpython-39.pyc' <<<
41684 1727204445.73946: stdout chunk (state=3): >>># /usr/lib64/python3.9/json/__pycache__/scanner.cpython-39.pyc matches /usr/lib64/python3.9/json/scanner.py # code object from '/usr/lib64/python3.9/json/__pycache__/scanner.cpython-39.pyc' <<<
41684 1727204445.73975: stdout chunk (state=3): >>># extension module '_json' loaded from '/usr/lib64/python3.9/lib-dynload/_json.cpython-39-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.9/lib-dynload/_json.cpython-39-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f27c0923160> <<<
41684 1727204445.74015: stdout chunk (state=3): >>>import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7f27c0923280> <<<
41684 1727204445.74037: stdout chunk (state=3): >>>import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f27c0923fd0> <<<
41684 1727204445.74069: stdout chunk (state=3): >>># /usr/lib64/python3.9/json/__pycache__/encoder.cpython-39.pyc matches /usr/lib64/python3.9/json/encoder.py # code object from '/usr/lib64/python3.9/json/__pycache__/encoder.cpython-39.pyc' <<<
41684 1727204445.74122: stdout chunk (state=3): >>>import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f27c09234f0> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7f27c0923df0> import 'atexit' # <<<
41684 1727204445.74154: stdout chunk (state=3): >>># extension module 'fcntl' loaded from '/usr/lib64/python3.9/lib-dynload/fcntl.cpython-39-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.9/lib-dynload/fcntl.cpython-39-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f27c0923580> <<<
41684 1727204445.74176: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/locale.cpython-39.pyc matches /usr/lib64/python3.9/locale.py <<<
41684 1727204445.74187: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/locale.cpython-39.pyc' <<<
41684 1727204445.74253: stdout chunk (state=3): >>>import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f27c0923100> <<<
41684 1727204445.74258: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/platform.cpython-39.pyc matches /usr/lib64/python3.9/platform.py <<<
41684 1727204445.74278: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/platform.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/subprocess.cpython-39.pyc matches /usr/lib64/python3.9/subprocess.py <<<
41684 1727204445.74312: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/subprocess.cpython-39.pyc' <<<
41684 1727204445.74328: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/signal.cpython-39.pyc matches /usr/lib64/python3.9/signal.py # code object from '/usr/lib64/python3.9/__pycache__/signal.cpython-39.pyc' <<<
41684 1727204445.74403: stdout chunk (state=3): >>>import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f27c08b8070> <<<
41684 1727204445.74445: stdout chunk (state=3): >>># extension module '_posixsubprocess' loaded from '/usr/lib64/python3.9/lib-dynload/_posixsubprocess.cpython-39-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.9/lib-dynload/_posixsubprocess.cpython-39-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f27c08003a0> <<<
41684 1727204445.74482: stdout chunk (state=3): >>># extension module 'select' loaded from '/usr/lib64/python3.9/lib-dynload/select.cpython-39-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.9/lib-dynload/select.cpython-39-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f27c08000a0> <<<
41684 1727204445.74500: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/selectors.cpython-39.pyc matches /usr/lib64/python3.9/selectors.py # code object from '/usr/lib64/python3.9/__pycache__/selectors.cpython-39.pyc' <<<
41684 1727204445.74549: stdout chunk (state=3): >>>import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7f27c0800d00> <<<
41684 1727204445.74552: stdout chunk (state=3): >>>import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7f27c090bdc0> <<<
41684 1727204445.74729: stdout chunk (state=3): >>>import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7f27c090b3a0> <<<
41684 1727204445.74755: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/shlex.cpython-39.pyc matches /usr/lib64/python3.9/shlex.py # code object from '/usr/lib64/python3.9/__pycache__/shlex.cpython-39.pyc' <<<
41684 1727204445.74794: stdout chunk (state=3): >>>import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7f27c090bf40> <<<
41684 1727204445.74797: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/traceback.cpython-39.pyc matches /usr/lib64/python3.9/traceback.py <<<
41684 1727204445.74820: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/traceback.cpython-39.pyc' <<<
41684 1727204445.74852: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/linecache.cpython-39.pyc matches /usr/lib64/python3.9/linecache.py # code object from '/usr/lib64/python3.9/__pycache__/linecache.cpython-39.pyc' <<<
41684 1727204445.74883: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/tokenize.cpython-39.pyc matches /usr/lib64/python3.9/tokenize.py <<<
41684 1727204445.74904: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/tokenize.cpython-39.pyc' <<<
41684 1727204445.74907: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/token.cpython-39.pyc matches /usr/lib64/python3.9/token.py # code object from '/usr/lib64/python3.9/__pycache__/token.cpython-39.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7f27c095ae80> <<<
41684 1727204445.74976: stdout chunk (state=3): >>>import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f27c08e1d90> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7f27c08e1460> import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7f27c0920ac0> <<<
41684 1727204445.75000: stdout chunk (state=3): >>># extension module 'syslog' loaded from '/usr/lib64/python3.9/lib-dynload/syslog.cpython-39-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.9/lib-dynload/syslog.cpython-39-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f27c08e1580> <<<
41684 1727204445.75041: stdout chunk (state=3): >>># /usr/lib64/python3.9/site-packages/systemd/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/__init__.cpython-39.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7f27c08e15b0> <<<
41684 1727204445.75060: stdout chunk (state=3): >>># /usr/lib64/python3.9/site-packages/systemd/__pycache__/journal.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/journal.py # code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/journal.cpython-39.pyc' <<<
41684 1727204445.75086: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/datetime.cpython-39.pyc matches /usr/lib64/python3.9/datetime.py <<<
41684 1727204445.75115: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/datetime.cpython-39.pyc' <<<
41684 1727204445.75188: stdout chunk (state=3): >>># extension module '_datetime' loaded from '/usr/lib64/python3.9/lib-dynload/_datetime.cpython-39-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.9/lib-dynload/_datetime.cpython-39-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f27c086bf70> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7f27c096c2b0> <<<
41684 1727204445.75209: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/uuid.cpython-39.pyc matches /usr/lib64/python3.9/uuid.py <<<
41684 1727204445.75220: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/uuid.cpython-39.pyc' <<<
41684 1727204445.75283: stdout chunk (state=3): >>># extension module '_uuid' loaded from '/usr/lib64/python3.9/lib-dynload/_uuid.cpython-39-x86_64-linux-gnu.so' <<<
41684 1727204445.75287: stdout chunk (state=3): >>># extension module '_uuid' executed from '/usr/lib64/python3.9/lib-dynload/_uuid.cpython-39-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f27c08687f0> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7f27c096c430> <<<
41684 1727204445.75296: stdout chunk (state=3): >>># /usr/lib64/python3.9/logging/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/logging/__init__.py <<<
41684 1727204445.75332: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/logging/__pycache__/__init__.cpython-39.pyc' <<<
41684 1727204445.75377: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/string.cpython-39.pyc matches /usr/lib64/python3.9/string.py <<<
41684 1727204445.75381: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/string.cpython-39.pyc' import '_string' # <<<
41684 1727204445.75428: stdout chunk (state=3): >>>import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7f27c096cc40> <<<
41684 1727204445.75555: stdout chunk (state=3): >>>import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7f27c0868790> <<<
41684 1727204445.75643: stdout chunk (state=3): >>># extension module 'systemd._journal' loaded from '/usr/lib64/python3.9/site-packages/systemd/_journal.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from
'/usr/lib64/python3.9/site-packages/systemd/_journal.cpython-39-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f27c096c100> <<< 41684 1727204445.75688: stdout chunk (state=3): >>># extension module 'systemd._reader' loaded from '/usr/lib64/python3.9/site-packages/systemd/_reader.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.9/site-packages/systemd/_reader.cpython-39-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f27c096c5b0> <<< 41684 1727204445.75720: stdout chunk (state=3): >>># extension module 'systemd.id128' loaded from '/usr/lib64/python3.9/site-packages/systemd/id128.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.9/site-packages/systemd/id128.cpython-39-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f27c096cf70> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f27c0965970> <<< 41684 1727204445.75756: stdout chunk (state=3): >>># /usr/lib64/python3.9/site-packages/systemd/__pycache__/daemon.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/daemon.cpython-39.pyc' <<< 41684 1727204445.75782: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/socket.cpython-39.pyc matches /usr/lib64/python3.9/socket.py <<< 41684 1727204445.75784: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/socket.cpython-39.pyc' <<< 41684 1727204445.75828: stdout chunk (state=3): >>># extension module '_socket' loaded from '/usr/lib64/python3.9/lib-dynload/_socket.cpython-39-x86_64-linux-gnu.so' # extension module '_socket' executed from '/usr/lib64/python3.9/lib-dynload/_socket.cpython-39-x86_64-linux-gnu.so' import 
'_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f27c085e8e0> <<< 41684 1727204445.76004: stdout chunk (state=3): >>># extension module 'array' loaded from '/usr/lib64/python3.9/lib-dynload/array.cpython-39-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.9/lib-dynload/array.cpython-39-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f27c087cdf0> <<< 41684 1727204445.76038: stdout chunk (state=3): >>>import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7f27c0867520> <<< 41684 1727204445.76088: stdout chunk (state=3): >>># extension module 'systemd._daemon' loaded from '/usr/lib64/python3.9/site-packages/systemd/_daemon.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.9/site-packages/systemd/_daemon.cpython-39-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f27c085ee80> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7f27c0867940> # zipimport: zlib available # zipimport: zlib available <<< 41684 1727204445.76105: stdout chunk (state=3): >>>import ansible.module_utils.compat # loaded from Zip /tmp/ansible_setup_payload_jtlx47sb/ansible_setup_payload.zip/ansible/module_utils/compat/__init__.py <<< 41684 1727204445.76108: stdout chunk (state=3): >>># zipimport: zlib available <<< 41684 1727204445.76174: stdout chunk (state=3): >>># zipimport: zlib available <<< 41684 1727204445.76255: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common # loaded from Zip /tmp/ansible_setup_payload_jtlx47sb/ansible_setup_payload.zip/ansible/module_utils/common/__init__.py <<< 41684 1727204445.76291: stdout chunk (state=3): >>># zipimport: zlib available <<< 41684 1727204445.76312: stdout chunk (state=3): >>># zipimport: zlib available import 
ansible.module_utils.common.text # loaded from Zip /tmp/ansible_setup_payload_jtlx47sb/ansible_setup_payload.zip/ansible/module_utils/common/text/__init__.py # zipimport: zlib available <<< 41684 1727204445.76411: stdout chunk (state=3): >>># zipimport: zlib available <<< 41684 1727204445.76507: stdout chunk (state=3): >>># zipimport: zlib available <<< 41684 1727204445.76961: stdout chunk (state=3): >>># zipimport: zlib available <<< 41684 1727204445.77437: stdout chunk (state=3): >>>import ansible.module_utils.six # loaded from Zip /tmp/ansible_setup_payload_jtlx47sb/ansible_setup_payload.zip/ansible/module_utils/six/__init__.py <<< 41684 1727204445.77442: stdout chunk (state=3): >>>import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import ansible.module_utils.common.text.converters # loaded from Zip /tmp/ansible_setup_payload_jtlx47sb/ansible_setup_payload.zip/ansible/module_utils/common/text/converters.py <<< 41684 1727204445.77486: stdout chunk (state=3): >>># /usr/lib64/python3.9/ctypes/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/ctypes/__init__.py # code object from '/usr/lib64/python3.9/ctypes/__pycache__/__init__.cpython-39.pyc' <<< 41684 1727204445.77530: stdout chunk (state=3): >>># extension module '_ctypes' loaded from '/usr/lib64/python3.9/lib-dynload/_ctypes.cpython-39-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.9/lib-dynload/_ctypes.cpython-39-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f27c0877790> <<< 41684 1727204445.77601: stdout chunk (state=3): >>># /usr/lib64/python3.9/ctypes/__pycache__/_endian.cpython-39.pyc matches /usr/lib64/python3.9/ctypes/_endian.py # code object from '/usr/lib64/python3.9/ctypes/__pycache__/_endian.cpython-39.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7f27c08b6850> <<< 41684 1727204445.77617: stdout chunk 
(state=3): >>>import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7f27c0417fd0> <<< 41684 1727204445.77672: stdout chunk (state=3): >>>import ansible.module_utils.compat.selinux # loaded from Zip /tmp/ansible_setup_payload_jtlx47sb/ansible_setup_payload.zip/ansible/module_utils/compat/selinux.py <<< 41684 1727204445.77696: stdout chunk (state=3): >>># zipimport: zlib available <<< 41684 1727204445.77712: stdout chunk (state=3): >>># zipimport: zlib available <<< 41684 1727204445.77715: stdout chunk (state=3): >>>import ansible.module_utils._text # loaded from Zip /tmp/ansible_setup_payload_jtlx47sb/ansible_setup_payload.zip/ansible/module_utils/_text.py # zipimport: zlib available <<< 41684 1727204445.77830: stdout chunk (state=3): >>># zipimport: zlib available <<< 41684 1727204445.77958: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/copy.cpython-39.pyc matches /usr/lib64/python3.9/copy.py # code object from '/usr/lib64/python3.9/__pycache__/copy.cpython-39.pyc' <<< 41684 1727204445.77996: stdout chunk (state=3): >>>import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f27c08e92e0> <<< 41684 1727204445.78000: stdout chunk (state=3): >>># zipimport: zlib available <<< 41684 1727204445.78374: stdout chunk (state=3): >>># zipimport: zlib available <<< 41684 1727204445.78740: stdout chunk (state=3): >>># zipimport: zlib available <<< 41684 1727204445.78799: stdout chunk (state=3): >>># zipimport: zlib available <<< 41684 1727204445.78860: stdout chunk (state=3): >>>import ansible.module_utils.common.collections # loaded from Zip /tmp/ansible_setup_payload_jtlx47sb/ansible_setup_payload.zip/ansible/module_utils/common/collections.py # zipimport: zlib available <<< 41684 1727204445.78901: stdout chunk (state=3): >>># zipimport: zlib available <<< 41684 1727204445.78941: stdout chunk (state=3): >>>import ansible.module_utils.common.warnings # loaded from Zip 
/tmp/ansible_setup_payload_jtlx47sb/ansible_setup_payload.zip/ansible/module_utils/common/warnings.py <<< 41684 1727204445.78945: stdout chunk (state=3): >>># zipimport: zlib available <<< 41684 1727204445.78994: stdout chunk (state=3): >>># zipimport: zlib available <<< 41684 1727204445.79078: stdout chunk (state=3): >>>import ansible.module_utils.errors # loaded from Zip /tmp/ansible_setup_payload_jtlx47sb/ansible_setup_payload.zip/ansible/module_utils/errors.py <<< 41684 1727204445.79113: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import ansible.module_utils.parsing # loaded from Zip /tmp/ansible_setup_payload_jtlx47sb/ansible_setup_payload.zip/ansible/module_utils/parsing/__init__.py <<< 41684 1727204445.79117: stdout chunk (state=3): >>># zipimport: zlib available <<< 41684 1727204445.79147: stdout chunk (state=3): >>># zipimport: zlib available <<< 41684 1727204445.79189: stdout chunk (state=3): >>>import ansible.module_utils.parsing.convert_bool # loaded from Zip /tmp/ansible_setup_payload_jtlx47sb/ansible_setup_payload.zip/ansible/module_utils/parsing/convert_bool.py # zipimport: zlib available <<< 41684 1727204445.79382: stdout chunk (state=3): >>># zipimport: zlib available <<< 41684 1727204445.79589: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/ast.cpython-39.pyc matches /usr/lib64/python3.9/ast.py <<< 41684 1727204445.79603: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/ast.cpython-39.pyc' import '_ast' # <<< 41684 1727204445.79691: stdout chunk (state=3): >>>import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7f27c0929ca0> # zipimport: zlib available <<< 41684 1727204445.79743: stdout chunk (state=3): >>># zipimport: zlib available <<< 41684 1727204445.79833: stdout chunk (state=3): >>>import ansible.module_utils.common.text.formatters # loaded from Zip 
/tmp/ansible_setup_payload_jtlx47sb/ansible_setup_payload.zip/ansible/module_utils/common/text/formatters.py import ansible.module_utils.common.validation # loaded from Zip /tmp/ansible_setup_payload_jtlx47sb/ansible_setup_payload.zip/ansible/module_utils/common/validation.py import ansible.module_utils.common.parameters # loaded from Zip /tmp/ansible_setup_payload_jtlx47sb/ansible_setup_payload.zip/ansible/module_utils/common/parameters.py <<< 41684 1727204445.79837: stdout chunk (state=3): >>>import ansible.module_utils.common.arg_spec # loaded from Zip /tmp/ansible_setup_payload_jtlx47sb/ansible_setup_payload.zip/ansible/module_utils/common/arg_spec.py <<< 41684 1727204445.79855: stdout chunk (state=3): >>># zipimport: zlib available <<< 41684 1727204445.79885: stdout chunk (state=3): >>># zipimport: zlib available <<< 41684 1727204445.79932: stdout chunk (state=3): >>>import ansible.module_utils.common.locale # loaded from Zip /tmp/ansible_setup_payload_jtlx47sb/ansible_setup_payload.zip/ansible/module_utils/common/locale.py <<< 41684 1727204445.79935: stdout chunk (state=3): >>># zipimport: zlib available <<< 41684 1727204445.79961: stdout chunk (state=3): >>># zipimport: zlib available <<< 41684 1727204445.80001: stdout chunk (state=3): >>># zipimport: zlib available <<< 41684 1727204445.80092: stdout chunk (state=3): >>># zipimport: zlib available <<< 41684 1727204445.80157: stdout chunk (state=3): >>># /usr/lib64/python3.9/site-packages/selinux/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/selinux/__init__.py <<< 41684 1727204445.80181: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/site-packages/selinux/__pycache__/__init__.cpython-39.pyc' <<< 41684 1727204445.80260: stdout chunk (state=3): >>># extension module 'selinux._selinux' loaded from '/usr/lib64/python3.9/site-packages/selinux/_selinux.cpython-39-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from 
'/usr/lib64/python3.9/site-packages/selinux/_selinux.cpython-39-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f27c089bc40> <<< 41684 1727204445.80340: stdout chunk (state=3): >>>import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7f27c0929be0> <<< 41684 1727204445.80397: stdout chunk (state=3): >>>import ansible.module_utils.common.file # loaded from Zip /tmp/ansible_setup_payload_jtlx47sb/ansible_setup_payload.zip/ansible/module_utils/common/file.py import ansible.module_utils.common.process # loaded from Zip /tmp/ansible_setup_payload_jtlx47sb/ansible_setup_payload.zip/ansible/module_utils/common/process.py # zipimport: zlib available <<< 41684 1727204445.80433: stdout chunk (state=3): >>># zipimport: zlib available <<< 41684 1727204445.80486: stdout chunk (state=3): >>># zipimport: zlib available <<< 41684 1727204445.80517: stdout chunk (state=3): >>># zipimport: zlib available <<< 41684 1727204445.80554: stdout chunk (state=3): >>># /usr/lib/python3.9/site-packages/__pycache__/distro.cpython-39.pyc matches /usr/lib/python3.9/site-packages/distro.py <<< 41684 1727204445.80591: stdout chunk (state=3): >>># code object from '/usr/lib/python3.9/site-packages/__pycache__/distro.cpython-39.pyc' <<< 41684 1727204445.80594: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/argparse.cpython-39.pyc matches /usr/lib64/python3.9/argparse.py <<< 41684 1727204445.80635: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/argparse.cpython-39.pyc' <<< 41684 1727204445.80638: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/gettext.cpython-39.pyc matches /usr/lib64/python3.9/gettext.py <<< 41684 1727204445.80667: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/gettext.cpython-39.pyc' <<< 41684 1727204445.80738: stdout chunk (state=3): >>>import 'gettext' # <_frozen_importlib_external.SourceFileLoader object 
at 0x7f27c08ad910> <<< 41684 1727204445.80786: stdout chunk (state=3): >>>import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f27c08f7b50> <<< 41684 1727204445.80851: stdout chunk (state=3): >>>import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f27c0289820> # destroy ansible.module_utils.distro import ansible.module_utils.distro # loaded from Zip /tmp/ansible_setup_payload_jtlx47sb/ansible_setup_payload.zip/ansible/module_utils/distro/__init__.py <<< 41684 1727204445.80855: stdout chunk (state=3): >>># zipimport: zlib available <<< 41684 1727204445.80889: stdout chunk (state=3): >>># zipimport: zlib available <<< 41684 1727204445.80901: stdout chunk (state=3): >>>import ansible.module_utils.common._utils # loaded from Zip /tmp/ansible_setup_payload_jtlx47sb/ansible_setup_payload.zip/ansible/module_utils/common/_utils.py import ansible.module_utils.common.sys_info # loaded from Zip /tmp/ansible_setup_payload_jtlx47sb/ansible_setup_payload.zip/ansible/module_utils/common/sys_info.py <<< 41684 1727204445.80990: stdout chunk (state=3): >>>import ansible.module_utils.basic # loaded from Zip /tmp/ansible_setup_payload_jtlx47sb/ansible_setup_payload.zip/ansible/module_utils/basic.py <<< 41684 1727204445.81019: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 41684 1727204445.81022: stdout chunk (state=3): >>>import ansible.modules # loaded from Zip /tmp/ansible_setup_payload_jtlx47sb/ansible_setup_payload.zip/ansible/modules/__init__.py # zipimport: zlib available <<< 41684 1727204445.81073: stdout chunk (state=3): >>># zipimport: zlib available <<< 41684 1727204445.81126: stdout chunk (state=3): >>># zipimport: zlib available <<< 41684 1727204445.81144: stdout chunk (state=3): >>># zipimport: zlib available <<< 41684 1727204445.81160: stdout chunk (state=3): >>># zipimport: zlib available <<< 41684 1727204445.81197: stdout chunk (state=3): >>># zipimport: zlib available <<< 41684 
1727204445.81236: stdout chunk (state=3): >>># zipimport: zlib available <<< 41684 1727204445.81291: stdout chunk (state=3): >>># zipimport: zlib available <<< 41684 1727204445.81304: stdout chunk (state=3): >>>import ansible.module_utils.facts.namespace # loaded from Zip /tmp/ansible_setup_payload_jtlx47sb/ansible_setup_payload.zip/ansible/module_utils/facts/namespace.py # zipimport: zlib available <<< 41684 1727204445.81367: stdout chunk (state=3): >>># zipimport: zlib available <<< 41684 1727204445.81437: stdout chunk (state=3): >>># zipimport: zlib available <<< 41684 1727204445.81450: stdout chunk (state=3): >>># zipimport: zlib available <<< 41684 1727204445.81488: stdout chunk (state=3): >>>import ansible.module_utils.compat.typing # loaded from Zip /tmp/ansible_setup_payload_jtlx47sb/ansible_setup_payload.zip/ansible/module_utils/compat/typing.py # zipimport: zlib available <<< 41684 1727204445.81639: stdout chunk (state=3): >>># zipimport: zlib available <<< 41684 1727204445.81777: stdout chunk (state=3): >>># zipimport: zlib available <<< 41684 1727204445.81811: stdout chunk (state=3): >>># zipimport: zlib available <<< 41684 1727204445.81870: stdout chunk (state=3): >>># /usr/lib64/python3.9/multiprocessing/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/__init__.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/__init__.cpython-39.pyc' <<< 41684 1727204445.81911: stdout chunk (state=3): >>># /usr/lib64/python3.9/multiprocessing/__pycache__/context.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/context.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/context.cpython-39.pyc' <<< 41684 1727204445.81936: stdout chunk (state=3): >>># /usr/lib64/python3.9/multiprocessing/__pycache__/process.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/process.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/process.cpython-39.pyc' <<< 41684 
1727204445.81969: stdout chunk (state=3): >>>import 'multiprocessing.process' # <_frozen_importlib_external.SourceFileLoader object at 0x7f27c0174100> <<< 41684 1727204445.81997: stdout chunk (state=3): >>># /usr/lib64/python3.9/multiprocessing/__pycache__/reduction.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/reduction.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/reduction.cpython-39.pyc' <<< 41684 1727204445.82000: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/pickle.cpython-39.pyc matches /usr/lib64/python3.9/pickle.py <<< 41684 1727204445.82039: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/pickle.cpython-39.pyc' <<< 41684 1727204445.82076: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/_compat_pickle.cpython-39.pyc matches /usr/lib64/python3.9/_compat_pickle.py # code object from '/usr/lib64/python3.9/__pycache__/_compat_pickle.cpython-39.pyc' <<< 41684 1727204445.82084: stdout chunk (state=3): >>>import '_compat_pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f27c03d8a90> <<< 41684 1727204445.82109: stdout chunk (state=3): >>># extension module '_pickle' loaded from '/usr/lib64/python3.9/lib-dynload/_pickle.cpython-39-x86_64-linux-gnu.so' # extension module '_pickle' executed from '/usr/lib64/python3.9/lib-dynload/_pickle.cpython-39-x86_64-linux-gnu.so' import '_pickle' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f27c03d8a00> <<< 41684 1727204445.82201: stdout chunk (state=3): >>>import 'pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f27c03abdc0> <<< 41684 1727204445.82204: stdout chunk (state=3): >>>import 'multiprocessing.reduction' # <_frozen_importlib_external.SourceFileLoader object at 0x7f27c03ab790> <<< 41684 1727204445.82241: stdout chunk (state=3): >>>import 'multiprocessing.context' # <_frozen_importlib_external.SourceFileLoader object at 0x7f27c03f74c0> import 'multiprocessing' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f27c03f7d60> <<< 41684 1727204445.82245: stdout chunk (state=3): >>># /usr/lib64/python3.9/multiprocessing/__pycache__/pool.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/pool.py <<< 41684 1727204445.82288: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/pool.cpython-39.pyc' <<< 41684 1727204445.82301: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/queue.cpython-39.pyc matches /usr/lib64/python3.9/queue.py # code object from '/usr/lib64/python3.9/__pycache__/queue.cpython-39.pyc' <<< 41684 1727204445.82338: stdout chunk (state=3): >>># extension module '_queue' loaded from '/usr/lib64/python3.9/lib-dynload/_queue.cpython-39-x86_64-linux-gnu.so' <<< 41684 1727204445.82351: stdout chunk (state=3): >>># extension module '_queue' executed from '/usr/lib64/python3.9/lib-dynload/_queue.cpython-39-x86_64-linux-gnu.so' import '_queue' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f27c03bbee0> import 'queue' # <_frozen_importlib_external.SourceFileLoader object at 0x7f27c03bb9d0> <<< 41684 1727204445.82407: stdout chunk (state=3): >>># /usr/lib64/python3.9/multiprocessing/__pycache__/util.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/util.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/util.cpython-39.pyc'<<< 41684 1727204445.82453: stdout chunk (state=3): >>> import 'multiprocessing.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7f27c03bb1f0> # /usr/lib64/python3.9/multiprocessing/__pycache__/connection.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/connection.py <<< 41684 1727204445.82472: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/connection.cpython-39.pyc' <<< 41684 1727204445.82494: stdout chunk (state=3): >>># extension module '_multiprocessing' loaded from 
'/usr/lib64/python3.9/lib-dynload/_multiprocessing.cpython-39-x86_64-linux-gnu.so' # extension module '_multiprocessing' executed from '/usr/lib64/python3.9/lib-dynload/_multiprocessing.cpython-39-x86_64-linux-gnu.so' import '_multiprocessing' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f27c01d6280> <<< 41684 1727204445.82541: stdout chunk (state=3): >>>import 'multiprocessing.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7f27c0975a30> import 'multiprocessing.pool' # <_frozen_importlib_external.SourceFileLoader object at 0x7f27c03f7070> import ansible.module_utils.facts.timeout # loaded from Zip /tmp/ansible_setup_payload_jtlx47sb/ansible_setup_payload.zip/ansible/module_utils/facts/timeout.py <<< 41684 1727204445.82602: stdout chunk (state=3): >>>import ansible.module_utils.facts.collector # loaded from Zip /tmp/ansible_setup_payload_jtlx47sb/ansible_setup_payload.zip/ansible/module_utils/facts/collector.py # zipimport: zlib available <<< 41684 1727204445.82623: stdout chunk (state=3): >>># zipimport: zlib available import ansible.module_utils.facts.other # loaded from Zip /tmp/ansible_setup_payload_jtlx47sb/ansible_setup_payload.zip/ansible/module_utils/facts/other/__init__.py # zipimport: zlib available <<< 41684 1727204445.82644: stdout chunk (state=3): >>># zipimport: zlib available <<< 41684 1727204445.82698: stdout chunk (state=3): >>>import ansible.module_utils.facts.other.facter # loaded from Zip /tmp/ansible_setup_payload_jtlx47sb/ansible_setup_payload.zip/ansible/module_utils/facts/other/facter.py # zipimport: zlib available <<< 41684 1727204445.82748: stdout chunk (state=3): >>># zipimport: zlib available <<< 41684 1727204445.82800: stdout chunk (state=3): >>>import ansible.module_utils.facts.other.ohai # loaded from Zip /tmp/ansible_setup_payload_jtlx47sb/ansible_setup_payload.zip/ansible/module_utils/facts/other/ohai.py <<< 41684 1727204445.82843: stdout chunk (state=3): >>># zipimport: zlib available # 
zipimport: zlib available import ansible.module_utils.facts.system # loaded from Zip /tmp/ansible_setup_payload_jtlx47sb/ansible_setup_payload.zip/ansible/module_utils/facts/system/__init__.py <<< 41684 1727204445.82895: stdout chunk (state=3): >>># zipimport: zlib available <<< 41684 1727204445.82909: stdout chunk (state=3): >>># zipimport: zlib available import ansible.module_utils.facts.system.apparmor # loaded from Zip /tmp/ansible_setup_payload_jtlx47sb/ansible_setup_payload.zip/ansible/module_utils/facts/system/apparmor.py # zipimport: zlib available <<< 41684 1727204445.82948: stdout chunk (state=3): >>># zipimport: zlib available <<< 41684 1727204445.82992: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.caps # loaded from Zip /tmp/ansible_setup_payload_jtlx47sb/ansible_setup_payload.zip/ansible/module_utils/facts/system/caps.py # zipimport: zlib available <<< 41684 1727204445.83130: stdout chunk (state=3): >>># zipimport: zlib available import ansible.module_utils.facts.system.chroot # loaded from Zip /tmp/ansible_setup_payload_jtlx47sb/ansible_setup_payload.zip/ansible/module_utils/facts/system/chroot.py # zipimport: zlib available <<< 41684 1727204445.83134: stdout chunk (state=3): >>># zipimport: zlib available <<< 41684 1727204445.83185: stdout chunk (state=3): >>># zipimport: zlib available <<< 41684 1727204445.83221: stdout chunk (state=3): >>># zipimport: zlib available <<< 41684 1727204445.83287: stdout chunk (state=3): >>>import ansible.module_utils.facts.utils # loaded from Zip /tmp/ansible_setup_payload_jtlx47sb/ansible_setup_payload.zip/ansible/module_utils/facts/utils.py import ansible.module_utils.facts.system.cmdline # loaded from Zip /tmp/ansible_setup_payload_jtlx47sb/ansible_setup_payload.zip/ansible/module_utils/facts/system/cmdline.py <<< 41684 1727204445.83298: stdout chunk (state=3): >>># zipimport: zlib available <<< 41684 1727204445.83677: stdout chunk (state=3): >>># zipimport: zlib available <<< 41684 
1727204445.84028: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.distribution # loaded from Zip /tmp/ansible_setup_payload_jtlx47sb/ansible_setup_payload.zip/ansible/module_utils/facts/system/distribution.py # zipimport: zlib available <<< 41684 1727204445.84085: stdout chunk (state=3): >>># zipimport: zlib available <<< 41684 1727204445.84129: stdout chunk (state=3): >>># zipimport: zlib available <<< 41684 1727204445.84179: stdout chunk (state=3): >>># zipimport: zlib available <<< 41684 1727204445.84200: stdout chunk (state=3): >>>import ansible.module_utils.compat.datetime # loaded from Zip /tmp/ansible_setup_payload_jtlx47sb/ansible_setup_payload.zip/ansible/module_utils/compat/datetime.py import ansible.module_utils.facts.system.date_time # loaded from Zip /tmp/ansible_setup_payload_jtlx47sb/ansible_setup_payload.zip/ansible/module_utils/facts/system/date_time.py # zipimport: zlib available <<< 41684 1727204445.84236: stdout chunk (state=3): >>># zipimport: zlib available <<< 41684 1727204445.84286: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.env # loaded from Zip /tmp/ansible_setup_payload_jtlx47sb/ansible_setup_payload.zip/ansible/module_utils/facts/system/env.py <<< 41684 1727204445.84293: stdout chunk (state=3): >>># zipimport: zlib available <<< 41684 1727204445.84312: stdout chunk (state=3): >>># zipimport: zlib available <<< 41684 1727204445.84371: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.dns # loaded from Zip /tmp/ansible_setup_payload_jtlx47sb/ansible_setup_payload.zip/ansible/module_utils/facts/system/dns.py <<< 41684 1727204445.84412: stdout chunk (state=3): >>># zipimport: zlib available <<< 41684 1727204445.84415: stdout chunk (state=3): >>># zipimport: zlib available <<< 41684 1727204445.84438: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.fips # loaded from Zip 
/tmp/ansible_setup_payload_jtlx47sb/ansible_setup_payload.zip/ansible/module_utils/facts/system/fips.py # zipimport: zlib available <<< 41684 1727204445.84460: stdout chunk (state=3): >>># zipimport: zlib available <<< 41684 1727204445.84507: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.loadavg # loaded from Zip /tmp/ansible_setup_payload_jtlx47sb/ansible_setup_payload.zip/ansible/module_utils/facts/system/loadavg.py # zipimport: zlib available <<< 41684 1727204445.84570: stdout chunk (state=3): >>># zipimport: zlib available <<< 41684 1727204445.84638: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/glob.cpython-39.pyc matches /usr/lib64/python3.9/glob.py # code object from '/usr/lib64/python3.9/__pycache__/glob.cpython-39.pyc' <<< 41684 1727204445.84660: stdout chunk (state=3): >>>import 'glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f27c00c7ee0> <<< 41684 1727204445.84681: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/configparser.cpython-39.pyc matches /usr/lib64/python3.9/configparser.py <<< 41684 1727204445.84711: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/configparser.cpython-39.pyc' <<< 41684 1727204445.84870: stdout chunk (state=3): >>>import 'configparser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f27c00c79d0> <<< 41684 1727204445.84879: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.local # loaded from Zip /tmp/ansible_setup_payload_jtlx47sb/ansible_setup_payload.zip/ansible/module_utils/facts/system/local.py # zipimport: zlib available <<< 41684 1727204445.84931: stdout chunk (state=3): >>># zipimport: zlib available <<< 41684 1727204445.84994: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.lsb # loaded from Zip /tmp/ansible_setup_payload_jtlx47sb/ansible_setup_payload.zip/ansible/module_utils/facts/system/lsb.py <<< 41684 1727204445.84997: stdout chunk (state=3): >>># zipimport: zlib 
available <<< 41684 1727204445.85069: stdout chunk (state=3): >>># zipimport: zlib available <<< 41684 1727204445.85152: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.pkg_mgr # loaded from Zip /tmp/ansible_setup_payload_jtlx47sb/ansible_setup_payload.zip/ansible/module_utils/facts/system/pkg_mgr.py # zipimport: zlib available <<< 41684 1727204445.85210: stdout chunk (state=3): >>># zipimport: zlib available <<< 41684 1727204445.85294: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.platform # loaded from Zip /tmp/ansible_setup_payload_jtlx47sb/ansible_setup_payload.zip/ansible/module_utils/facts/system/platform.py <<< 41684 1727204445.85297: stdout chunk (state=3): >>># zipimport: zlib available <<< 41684 1727204445.85319: stdout chunk (state=3): >>># zipimport: zlib available <<< 41684 1727204445.85374: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/ssl.cpython-39.pyc matches /usr/lib64/python3.9/ssl.py <<< 41684 1727204445.85392: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/ssl.cpython-39.pyc' <<< 41684 1727204445.85523: stdout chunk (state=3): >>># extension module '_ssl' loaded from '/usr/lib64/python3.9/lib-dynload/_ssl.cpython-39-x86_64-linux-gnu.so' # extension module '_ssl' executed from '/usr/lib64/python3.9/lib-dynload/_ssl.cpython-39-x86_64-linux-gnu.so' import '_ssl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f27c00ef040> <<< 41684 1727204445.85766: stdout chunk (state=3): >>>import 'ssl' # <_frozen_importlib_external.SourceFileLoader object at 0x7f27c03f78e0> import ansible.module_utils.facts.system.python # loaded from Zip /tmp/ansible_setup_payload_jtlx47sb/ansible_setup_payload.zip/ansible/module_utils/facts/system/python.py # zipimport: zlib available <<< 41684 1727204445.85814: stdout chunk (state=3): >>># zipimport: zlib available <<< 41684 1727204445.85869: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.selinux # 
loaded from Zip /tmp/ansible_setup_payload_jtlx47sb/ansible_setup_payload.zip/ansible/module_utils/facts/system/selinux.py <<< 41684 1727204445.85873: stdout chunk (state=3): >>># zipimport: zlib available <<< 41684 1727204445.85946: stdout chunk (state=3): >>># zipimport: zlib available <<< 41684 1727204445.86014: stdout chunk (state=3): >>># zipimport: zlib available <<< 41684 1727204445.86110: stdout chunk (state=3): >>># zipimport: zlib available <<< 41684 1727204445.86243: stdout chunk (state=3): >>>import ansible.module_utils.compat.version # loaded from Zip /tmp/ansible_setup_payload_jtlx47sb/ansible_setup_payload.zip/ansible/module_utils/compat/version.py <<< 41684 1727204445.86249: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.service_mgr # loaded from Zip /tmp/ansible_setup_payload_jtlx47sb/ansible_setup_payload.zip/ansible/module_utils/facts/system/service_mgr.py # zipimport: zlib available <<< 41684 1727204445.86291: stdout chunk (state=3): >>># zipimport: zlib available <<< 41684 1727204445.86326: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.ssh_pub_keys # loaded from Zip /tmp/ansible_setup_payload_jtlx47sb/ansible_setup_payload.zip/ansible/module_utils/facts/system/ssh_pub_keys.py <<< 41684 1727204445.86333: stdout chunk (state=3): >>># zipimport: zlib available <<< 41684 1727204445.86364: stdout chunk (state=3): >>># zipimport: zlib available <<< 41684 1727204445.86412: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/getpass.cpython-39.pyc matches /usr/lib64/python3.9/getpass.py # code object from '/usr/lib64/python3.9/__pycache__/getpass.cpython-39.pyc' <<< 41684 1727204445.86459: stdout chunk (state=3): >>># extension module 'termios' loaded from '/usr/lib64/python3.9/lib-dynload/termios.cpython-39-x86_64-linux-gnu.so' # extension module 'termios' executed from '/usr/lib64/python3.9/lib-dynload/termios.cpython-39-x86_64-linux-gnu.so' import 'termios' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7f27c013adf0> <<< 41684 1727204445.86472: stdout chunk (state=3): >>>import 'getpass' # <_frozen_importlib_external.SourceFileLoader object at 0x7f27c013a580> import ansible.module_utils.facts.system.user # loaded from Zip /tmp/ansible_setup_payload_jtlx47sb/ansible_setup_payload.zip/ansible/module_utils/facts/system/user.py # zipimport: zlib available <<< 41684 1727204445.86489: stdout chunk (state=3): >>># zipimport: zlib available import ansible.module_utils.facts.hardware # loaded from Zip /tmp/ansible_setup_payload_jtlx47sb/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/__init__.py <<< 41684 1727204445.86509: stdout chunk (state=3): >>># zipimport: zlib available <<< 41684 1727204445.86541: stdout chunk (state=3): >>># zipimport: zlib available <<< 41684 1727204445.86593: stdout chunk (state=3): >>>import ansible.module_utils.facts.hardware.base # loaded from Zip /tmp/ansible_setup_payload_jtlx47sb/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/base.py # zipimport: zlib available <<< 41684 1727204445.86722: stdout chunk (state=3): >>># zipimport: zlib available <<< 41684 1727204445.86854: stdout chunk (state=3): >>>import ansible.module_utils.facts.hardware.aix # loaded from Zip /tmp/ansible_setup_payload_jtlx47sb/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/aix.py <<< 41684 1727204445.86861: stdout chunk (state=3): >>># zipimport: zlib available <<< 41684 1727204445.86935: stdout chunk (state=3): >>># zipimport: zlib available <<< 41684 1727204445.87021: stdout chunk (state=3): >>># zipimport: zlib available <<< 41684 1727204445.87054: stdout chunk (state=3): >>># zipimport: zlib available <<< 41684 1727204445.87103: stdout chunk (state=3): >>>import ansible.module_utils.facts.sysctl # loaded from Zip /tmp/ansible_setup_payload_jtlx47sb/ansible_setup_payload.zip/ansible/module_utils/facts/sysctl.py import ansible.module_utils.facts.hardware.darwin # 
loaded from Zip /tmp/ansible_setup_payload_jtlx47sb/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/darwin.py <<< 41684 1727204445.87110: stdout chunk (state=3): >>># zipimport: zlib available <<< 41684 1727204445.87181: stdout chunk (state=3): >>># zipimport: zlib available <<< 41684 1727204445.87197: stdout chunk (state=3): >>># zipimport: zlib available <<< 41684 1727204445.87309: stdout chunk (state=3): >>># zipimport: zlib available <<< 41684 1727204445.87435: stdout chunk (state=3): >>>import ansible.module_utils.facts.hardware.freebsd # loaded from Zip /tmp/ansible_setup_payload_jtlx47sb/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/freebsd.py import ansible.module_utils.facts.hardware.dragonfly # loaded from Zip /tmp/ansible_setup_payload_jtlx47sb/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/dragonfly.py # zipimport: zlib available <<< 41684 1727204445.87542: stdout chunk (state=3): >>># zipimport: zlib available <<< 41684 1727204445.87651: stdout chunk (state=3): >>>import ansible.module_utils.facts.hardware.hpux # loaded from Zip /tmp/ansible_setup_payload_jtlx47sb/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/hpux.py # zipimport: zlib available <<< 41684 1727204445.87687: stdout chunk (state=3): >>># zipimport: zlib available <<< 41684 1727204445.87716: stdout chunk (state=3): >>># zipimport: zlib available <<< 41684 1727204445.88147: stdout chunk (state=3): >>># zipimport: zlib available <<< 41684 1727204445.88575: stdout chunk (state=3): >>>import ansible.module_utils.facts.hardware.linux # loaded from Zip /tmp/ansible_setup_payload_jtlx47sb/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/linux.py import ansible.module_utils.facts.hardware.hurd # loaded from Zip /tmp/ansible_setup_payload_jtlx47sb/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/hurd.py # zipimport: zlib available <<< 41684 1727204445.88664: stdout chunk (state=3): >>># zipimport: zlib available 
<<< 41684 1727204445.88771: stdout chunk (state=3): >>>import ansible.module_utils.facts.hardware.netbsd # loaded from Zip /tmp/ansible_setup_payload_jtlx47sb/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/netbsd.py # zipimport: zlib available <<< 41684 1727204445.88971: stdout chunk (state=3): >>># zipimport: zlib available <<< 41684 1727204445.88975: stdout chunk (state=3): >>>import ansible.module_utils.facts.hardware.openbsd # loaded from Zip /tmp/ansible_setup_payload_jtlx47sb/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/openbsd.py # zipimport: zlib available <<< 41684 1727204445.89069: stdout chunk (state=3): >>># zipimport: zlib available <<< 41684 1727204445.89200: stdout chunk (state=3): >>>import ansible.module_utils.facts.hardware.sunos # loaded from Zip /tmp/ansible_setup_payload_jtlx47sb/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/sunos.py <<< 41684 1727204445.89215: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network # loaded from Zip /tmp/ansible_setup_payload_jtlx47sb/ansible_setup_payload.zip/ansible/module_utils/facts/network/__init__.py <<< 41684 1727204445.89232: stdout chunk (state=3): >>># zipimport: zlib available <<< 41684 1727204445.89270: stdout chunk (state=3): >>># zipimport: zlib available <<< 41684 1727204445.89313: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.base # loaded from Zip /tmp/ansible_setup_payload_jtlx47sb/ansible_setup_payload.zip/ansible/module_utils/facts/network/base.py # zipimport: zlib available <<< 41684 1727204445.89401: stdout chunk (state=3): >>># zipimport: zlib available <<< 41684 1727204445.89481: stdout chunk (state=3): >>># zipimport: zlib available <<< 41684 1727204445.89659: stdout chunk (state=3): >>># zipimport: zlib available <<< 41684 1727204445.89970: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.generic_bsd # loaded from Zip 
/tmp/ansible_setup_payload_jtlx47sb/ansible_setup_payload.zip/ansible/module_utils/facts/network/generic_bsd.py import ansible.module_utils.facts.network.aix # loaded from Zip /tmp/ansible_setup_payload_jtlx47sb/ansible_setup_payload.zip/ansible/module_utils/facts/network/aix.py <<< 41684 1727204445.89973: stdout chunk (state=3): >>># zipimport: zlib available <<< 41684 1727204445.89975: stdout chunk (state=3): >>># zipimport: zlib available <<< 41684 1727204445.89977: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.darwin # loaded from Zip /tmp/ansible_setup_payload_jtlx47sb/ansible_setup_payload.zip/ansible/module_utils/facts/network/darwin.py <<< 41684 1727204445.89979: stdout chunk (state=3): >>># zipimport: zlib available <<< 41684 1727204445.89981: stdout chunk (state=3): >>># zipimport: zlib available <<< 41684 1727204445.89983: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.dragonfly # loaded from Zip /tmp/ansible_setup_payload_jtlx47sb/ansible_setup_payload.zip/ansible/module_utils/facts/network/dragonfly.py # zipimport: zlib available <<< 41684 1727204445.90016: stdout chunk (state=3): >>># zipimport: zlib available <<< 41684 1727204445.90087: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.fc_wwn # loaded from Zip /tmp/ansible_setup_payload_jtlx47sb/ansible_setup_payload.zip/ansible/module_utils/facts/network/fc_wwn.py <<< 41684 1727204445.90098: stdout chunk (state=3): >>># zipimport: zlib available <<< 41684 1727204445.90105: stdout chunk (state=3): >>># zipimport: zlib available <<< 41684 1727204445.90131: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.freebsd # loaded from Zip /tmp/ansible_setup_payload_jtlx47sb/ansible_setup_payload.zip/ansible/module_utils/facts/network/freebsd.py <<< 41684 1727204445.90138: stdout chunk (state=3): >>># zipimport: zlib available <<< 41684 1727204445.90192: stdout chunk (state=3): >>># zipimport: zlib available <<< 41684 
1727204445.90244: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.hpux # loaded from Zip /tmp/ansible_setup_payload_jtlx47sb/ansible_setup_payload.zip/ansible/module_utils/facts/network/hpux.py <<< 41684 1727204445.90251: stdout chunk (state=3): >>># zipimport: zlib available <<< 41684 1727204445.90296: stdout chunk (state=3): >>># zipimport: zlib available <<< 41684 1727204445.90350: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.hurd # loaded from Zip /tmp/ansible_setup_payload_jtlx47sb/ansible_setup_payload.zip/ansible/module_utils/facts/network/hurd.py <<< 41684 1727204445.90356: stdout chunk (state=3): >>># zipimport: zlib available <<< 41684 1727204445.90578: stdout chunk (state=3): >>># zipimport: zlib available <<< 41684 1727204445.90799: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.linux # loaded from Zip /tmp/ansible_setup_payload_jtlx47sb/ansible_setup_payload.zip/ansible/module_utils/facts/network/linux.py <<< 41684 1727204445.90806: stdout chunk (state=3): >>># zipimport: zlib available <<< 41684 1727204445.90844: stdout chunk (state=3): >>># zipimport: zlib available <<< 41684 1727204445.90903: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.iscsi # loaded from Zip /tmp/ansible_setup_payload_jtlx47sb/ansible_setup_payload.zip/ansible/module_utils/facts/network/iscsi.py # zipimport: zlib available <<< 41684 1727204445.90945: stdout chunk (state=3): >>># zipimport: zlib available <<< 41684 1727204445.90963: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.nvme # loaded from Zip /tmp/ansible_setup_payload_jtlx47sb/ansible_setup_payload.zip/ansible/module_utils/facts/network/nvme.py <<< 41684 1727204445.90982: stdout chunk (state=3): >>># zipimport: zlib available <<< 41684 1727204445.91018: stdout chunk (state=3): >>># zipimport: zlib available <<< 41684 1727204445.91062: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.netbsd # 
loaded from Zip /tmp/ansible_setup_payload_jtlx47sb/ansible_setup_payload.zip/ansible/module_utils/facts/network/netbsd.py # zipimport: zlib available <<< 41684 1727204445.91079: stdout chunk (state=3): >>># zipimport: zlib available <<< 41684 1727204445.91118: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.openbsd # loaded from Zip /tmp/ansible_setup_payload_jtlx47sb/ansible_setup_payload.zip/ansible/module_utils/facts/network/openbsd.py # zipimport: zlib available <<< 41684 1727204445.91182: stdout chunk (state=3): >>># zipimport: zlib available <<< 41684 1727204445.91292: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.sunos # loaded from Zip /tmp/ansible_setup_payload_jtlx47sb/ansible_setup_payload.zip/ansible/module_utils/facts/network/sunos.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.virtual # loaded from Zip /tmp/ansible_setup_payload_jtlx47sb/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/__init__.py <<< 41684 1727204445.91298: stdout chunk (state=3): >>># zipimport: zlib available <<< 41684 1727204445.91341: stdout chunk (state=3): >>># zipimport: zlib available <<< 41684 1727204445.91402: stdout chunk (state=3): >>>import ansible.module_utils.facts.virtual.base # loaded from Zip /tmp/ansible_setup_payload_jtlx47sb/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/base.py # zipimport: zlib available <<< 41684 1727204445.91412: stdout chunk (state=3): >>># zipimport: zlib available <<< 41684 1727204445.91419: stdout chunk (state=3): >>># zipimport: zlib available <<< 41684 1727204445.91488: stdout chunk (state=3): >>># zipimport: zlib available <<< 41684 1727204445.91506: stdout chunk (state=3): >>># zipimport: zlib available <<< 41684 1727204445.91568: stdout chunk (state=3): >>># zipimport: zlib available <<< 41684 1727204445.91629: stdout chunk (state=3): >>>import ansible.module_utils.facts.virtual.sysctl # loaded from Zip 
/tmp/ansible_setup_payload_jtlx47sb/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/sysctl.py import ansible.module_utils.facts.virtual.freebsd # loaded from Zip /tmp/ansible_setup_payload_jtlx47sb/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/freebsd.py import ansible.module_utils.facts.virtual.dragonfly # loaded from Zip /tmp/ansible_setup_payload_jtlx47sb/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/dragonfly.py <<< 41684 1727204445.91646: stdout chunk (state=3): >>># zipimport: zlib available <<< 41684 1727204445.91687: stdout chunk (state=3): >>># zipimport: zlib available <<< 41684 1727204445.91740: stdout chunk (state=3): >>>import ansible.module_utils.facts.virtual.hpux # loaded from Zip /tmp/ansible_setup_payload_jtlx47sb/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/hpux.py # zipimport: zlib available <<< 41684 1727204445.91903: stdout chunk (state=3): >>># zipimport: zlib available <<< 41684 1727204445.92075: stdout chunk (state=3): >>>import ansible.module_utils.facts.virtual.linux # loaded from Zip /tmp/ansible_setup_payload_jtlx47sb/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/linux.py <<< 41684 1727204445.92083: stdout chunk (state=3): >>># zipimport: zlib available <<< 41684 1727204445.92114: stdout chunk (state=3): >>># zipimport: zlib available <<< 41684 1727204445.92159: stdout chunk (state=3): >>>import ansible.module_utils.facts.virtual.netbsd # loaded from Zip /tmp/ansible_setup_payload_jtlx47sb/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/netbsd.py # zipimport: zlib available <<< 41684 1727204445.92205: stdout chunk (state=3): >>># zipimport: zlib available <<< 41684 1727204445.92248: stdout chunk (state=3): >>>import ansible.module_utils.facts.virtual.openbsd # loaded from Zip /tmp/ansible_setup_payload_jtlx47sb/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/openbsd.py <<< 41684 1727204445.92255: stdout chunk (state=3): >>># zipimport: 
zlib available <<< 41684 1727204445.92317: stdout chunk (state=3): >>># zipimport: zlib available <<< 41684 1727204445.92404: stdout chunk (state=3): >>>import ansible.module_utils.facts.virtual.sunos # loaded from Zip /tmp/ansible_setup_payload_jtlx47sb/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/sunos.py import ansible.module_utils.facts.default_collectors # loaded from Zip /tmp/ansible_setup_payload_jtlx47sb/ansible_setup_payload.zip/ansible/module_utils/facts/default_collectors.py <<< 41684 1727204445.92411: stdout chunk (state=3): >>># zipimport: zlib available <<< 41684 1727204445.92476: stdout chunk (state=3): >>># zipimport: zlib available <<< 41684 1727204445.92550: stdout chunk (state=3): >>>import ansible.module_utils.facts.ansible_collector # loaded from Zip /tmp/ansible_setup_payload_jtlx47sb/ansible_setup_payload.zip/ansible/module_utils/facts/ansible_collector.py import ansible.module_utils.facts.compat # loaded from Zip /tmp/ansible_setup_payload_jtlx47sb/ansible_setup_payload.zip/ansible/module_utils/facts/compat.py import ansible.module_utils.facts # loaded from Zip /tmp/ansible_setup_payload_jtlx47sb/ansible_setup_payload.zip/ansible/module_utils/facts/__init__.py <<< 41684 1727204445.92621: stdout chunk (state=3): >>># zipimport: zlib available <<< 41684 1727204445.92821: stdout chunk (state=3): >>># /usr/lib64/python3.9/encodings/__pycache__/idna.cpython-39.pyc matches /usr/lib64/python3.9/encodings/idna.py <<< 41684 1727204445.92844: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/encodings/__pycache__/idna.cpython-39.pyc' <<< 41684 1727204445.92866: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/stringprep.cpython-39.pyc matches /usr/lib64/python3.9/stringprep.py <<< 41684 1727204445.92897: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/stringprep.cpython-39.pyc' <<< 41684 1727204445.92910: stdout chunk (state=3): >>># extension module 'unicodedata' loaded 
from '/usr/lib64/python3.9/lib-dynload/unicodedata.cpython-39-x86_64-linux-gnu.so' # extension module 'unicodedata' executed from '/usr/lib64/python3.9/lib-dynload/unicodedata.cpython-39-x86_64-linux-gnu.so' import 'unicodedata' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f27bfeeabb0> import 'stringprep' # <_frozen_importlib_external.SourceFileLoader object at 0x7f27bfead2e0> <<< 41684 1727204445.92969: stdout chunk (state=3): >>>import 'encodings.idna' # <_frozen_importlib_external.SourceFileLoader object at 0x7f27bfeadcd0> <<< 41684 1727204445.93786: stdout chunk (state=3): >>>import 'gc' # <<< 41684 1727204445.94193: stdout chunk (state=3): >>> <<< 41684 1727204445.94226: stdout chunk (state=3): >>>{"ansible_facts": {"ansible_date_time": {"year": "2024", "month": "09", "weekday": "Tuesday", "weekday_number": "2", "weeknumber": "39", "day": "24", "hour": "15", "minute": "00", "second": "45", "epoch": "1727204445", "epoch_int": "1727204445", "date": "2024-09-24", "time": "15:00:45", "iso8601_micro": "2024-09-24T19:00:45.927356Z", "iso8601": "2024-09-24T19:00:45Z", "iso8601_basic": "20240924T150045927356", "iso8601_basic_short": "20240924T150045", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-511.el9.x86_64", "root": "UUID=ad406aa3-aab4-4a6a-aa73-3e870a6316ae", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-511.el9.x86_64", "root": "UUID=ad406aa3-aab4-4a6a-aa73-3e870a6316ae", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_ssh_host_key_dsa_public": 
"AAAAB3NzaC1kc3MAAACBAL33r0sK53nK1ELEWEygFfBly+jKL3G1irB+e4OjfP+034giVSb4+qmZbsccYzFRUysDiwQ9AOkXxjXzuDmR+xqyvjg1wiGR1mtnhVEjD5QOMP3FrsA4T0YUj+99RePF5V1syZcivhL83fhGMQW2xqX2DsatToaaogZ2OB3PfjtjAAAAFQDxVD/D0jmbOX2y1lkpNUepQHopQwAAAIEAmIlCnXBlcPjUWk7TsM1FtBhlH1jHHCOTF1EkipNNB3yizhCo4XzHdZ42Etc3A12/rcZ94rFaauqCV6XrShBkQ2YBIcz9u8BOyWI/nScoq9IA/qLrhWWtjBWDyrdKnEa5YZssQtDa+FaZQkzy1TQpvFQxv5c95+TrmPFgDpw+0q0AAACBAKYOTFtEPTGzq9w6YdMspRWk65ZgXou58bQl818PvNnuZKKVReDFknfNCcabfj+HjlOg9wBCZZ+D3vopxZ4Qgevz/pLqcnLY7Kxx+xf6NhqDwcEwkHk/VYomBLrfyEZP8N81dcv36ZZUVoca5Y+2ZG2o1gC632nLGosyJBtmPmel", "ansible_ssh_host_key_dsa_public_keytype": "ssh-dss", "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCzhflzK5hY1zLI6wcdpu95QnCG0TwdK/8SyDudYYHDoRycyI9pVuSUQsXpQq3jHdjCurFgtKtyr8lvo1dWvKJ9SZpQk4asejncDNODyhSR95eNQg6E1G2kN1mscOp76cjW9Muvyhcku112WdRWTEojLJL5DfJAWrsWwHJI+QuhZuKvrlsxPvfOuY5td/aGC/Ydzbjkmya2qvXXJRscQArDnYmiPiatkFESRif9MXdmIn2LqQXAcZGFUG+SWQvZR1PDWKI2U5HxvoeUf+Uh2jDO3mFDWao9+SGRC2QuO+xLJgoiKIx2L3GWLTkbKjAbsk0iedaUuh+GdmUatsU09UVZi9IYBJYjhiYuZKsYx2LNpBqu8fxh5RaBfadQzkDGVBJE45/9X+9vlSygk3zMak9yWtS9vfV+CoODJx9wA1tv3r0Veiy/Y9bbcT7DtQhiWscP2X/cF2QZtdabW+Rb+zKZomn+6upN+zZeyVRClRsqVNURxevMs+UyJTKV481ayMU=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBHpEZiCiKJZKK5PvXzPGl0kyJcU4P7nxoUjBffLcHt9dAB0RhjGORZ4v3/W6TdO0PAsLaKZ7WyFecLN3V9VWyiA=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIK5vZWfq5/76ny3vCPOJqG/mpsIiiNwZzQWhA7bM1PFT", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_system": "Linux", "ansible_kernel": "5.14.0-511.el9.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Thu Sep 19 06:52:39 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.9.19", "ansible_fqdn": "managed-node1", "ansible_hostname": "managed-node1", 
"ansible_nodename": "managed-node1", "ansible_domain": "", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "622812727ab94fd6acd7dd0d437b6e90", "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "9", "ansible_distribution_major_version": "9", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_lsb": {}, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_fips": false, "ansible_local": {}, "ansible_python": {"version": {"major": 3, "minor": 9, "micro": 19, "releaselevel": "final", "serial": 0}, "version_info": [3, 9, 19, "final", 0], "executable": "/usr/bin/python3.9", "has_sslcontext": true, "type": "cpython"}, "ansible_apparmor": {"status": "disabled"}, "ansible_dns": {"search": ["us-east-1.aws.redh<<< 41684 1727204445.94246: stdout chunk (state=3): >>>at.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_service_mgr": "systemd", "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "root", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_env": {"PYTHONVERBOSE": "1", "SHELL": "/bin/bash", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.9", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.14.85 42862 10.31.9.148 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": 
"||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.14.85 42862 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "which_declare": "declare -f", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0", "BASH_FUNC_which%%": "() { ( alias;\n eval ${which_declare} ) | /usr/bin/which --tty-only --read-alias --read-functions --show-tilde --show-dot $@\n}"}, "ansible_pkg_mgr": "dnf", "gather_subset": ["min"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["min"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 41684 1727204445.94649: stdout chunk (state=3): >>># clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.path_hooks # clear sys.path_importer_cache <<< 41684 1727204445.94726: stdout chunk (state=3): >>># clear sys.meta_path # clear sys.__interactivehook__ # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing encodings.latin_1 # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # 
cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing _locale # cleanup[2] removing _bootlocale # destroy _bootlocale # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing sre_constants # destroy sre_constants # cleanup[2] removing sre_parse # cleanup[2] removing sre_compile # cleanup[2] removing _heapq # cleanup[2] removing heapq<<< 41684 1727204445.94768: stdout chunk (state=3): >>> # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing collections.abc # cleanup[2] removing contextlib # cleanup[2] removing typing # destroy typing # cleanup[2] removing importlib.abc # cleanup[2] removing importlib.util # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing threading # cleanup[2] removing _bz2 # destroy _bz2 # 
cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing pwd # cleanup[2] removing grp # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing tempfile # cleanup[2] removing zipfile # destroy zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json <<< 41684 1727204445.94806: stdout chunk (state=3): >>># cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing fcntl # cleanup[2] removing locale # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing subprocess # cleanup[2] removing platform # cleanup[2] removing shlex # cleanup[2] removing token # destroy token # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader <<< 41684 1727204445.94903: stdout chunk (state=3): >>># cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing 
systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # 
cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 <<< 41684 1727204445.94955: stdout chunk (state=3): >>># destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector <<< 41684 1727204445.94974: stdout chunk (state=3): >>># cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing 
ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline <<< 41684 1727204445.94995: stdout chunk (state=3): >>># cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys <<< 41684 1727204445.95002: stdout chunk (state=3): >>># cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # 
cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux <<< 41684 1727204445.95026: stdout chunk (state=3): >>># cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing 
ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other <<< 41684 1727204445.95055: stdout chunk (state=3): >>># destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy 
ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy 
ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna # cleanup[2] removing gc <<< 41684 1727204445.95316: stdout chunk (state=3): >>># destroy _sitebuiltins <<< 41684 1727204445.95335: stdout chunk (state=3): >>># destroy importlib.util # destroy importlib.abc # destroy importlib.machinery <<< 41684 1727204445.95371: stdout chunk (state=3): >>># destroy zipimport <<< 41684 1727204445.95397: stdout chunk (state=3): >>># destroy _compression # destroy binascii # destroy importlib # destroy bz2 # destroy lzma <<< 41684 1727204445.95421: stdout chunk (state=3): >>># destroy __main__ # destroy locale # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy encodings <<< 41684 1727204445.95440: stdout chunk (state=3): >>># destroy syslog # destroy uuid <<< 41684 1727204445.95479: stdout chunk (state=3): >>># destroy selinux # destroy distro # destroy logging # destroy argparse <<< 41684 1727204445.95517: stdout chunk (state=3): >>># destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector <<< 41684 1727204445.95546: stdout chunk (state=3): >>># destroy multiprocessing # destroy multiprocessing.connection # destroy multiprocessing.pool # destroy pickle # destroy multiprocessing.context # destroy array # destroy _compat_pickle <<< 41684 1727204445.95568: stdout chunk (state=3): >>># destroy queue # destroy multiprocessing.process # destroy tempfile # destroy multiprocessing.util # destroy multiprocessing.reduction # destroy shlex <<< 41684 1727204445.95589: stdout chunk (state=3): >>># destroy datetime # destroy base64 <<< 41684 1727204445.95608: stdout chunk (state=3): >>># destroy ansible.module_utils.compat.selinux # destroy getpass <<< 41684 1727204445.95635: stdout chunk 
(state=3): >>># destroy json # destroy socket # destroy struct # destroy glob # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector <<< 41684 1727204445.95695: stdout chunk (state=3): >>># cleanup[3] wiping gc # cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping unicodedata # cleanup[3] wiping termios <<< 41684 1727204445.95784: stdout chunk (state=3): >>># cleanup[3] wiping _ssl # cleanup[3] wiping configparser # cleanup[3] wiping _multiprocessing # cleanup[3] wiping _queue # cleanup[3] wiping _pickle # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # cleanup[3] wiping tokenize # cleanup[3] wiping platform # destroy subprocess # cleanup[3] wiping selectors # cleanup[3] wiping select # cleanup[3] wiping _posixsubprocess # cleanup[3] wiping signal # cleanup[3] wiping fcntl # cleanup[3] wiping atexit # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping _blake2 # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping shutil <<< 41684 1727204445.95805: stdout chunk (state=3): >>># destroy fnmatch # cleanup[3] wiping grp # cleanup[3] wiping pwd # cleanup[3] wiping _lzma # cleanup[3] wiping threading # cleanup[3] wiping zlib # cleanup[3] wiping errno # cleanup[3] wiping weakref # cleanup[3] wiping contextlib # cleanup[3] wiping collections.abc # cleanup[3] wiping warnings # cleanup[3] wiping 
importlib._bootstrap_external <<< 41684 1727204445.95860: stdout chunk (state=3): >>># cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy enum # destroy sre_compile # destroy copyreg # cleanup[3] wiping functools <<< 41684 1727204445.95902: stdout chunk (state=3): >>># cleanup[3] wiping _functools # destroy _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy heapq # destroy collections.abc # cleanup[3] wiping _collections # destroy _collections # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping itertools # cleanup[3] wiping _heapq # cleanup[3] wiping sre_parse # cleanup[3] wiping _sre # cleanup[3] wiping types # cleanup[3] wiping _locale # destroy _locale # cleanup[3] wiping os <<< 41684 1727204445.95930: stdout chunk (state=3): >>># cleanup[3] wiping os.path # destroy genericpath # cleanup[3] wiping posixpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.latin_1 # cleanup[3] wiping _signal # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix <<< 41684 1727204445.95955: stdout chunk (state=3): >>># cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins <<< 41684 1727204445.96011: stdout chunk (state=3): >>># destroy gc # destroy unicodedata # destroy termios # destroy _ssl # destroy _multiprocessing # destroy _queue # destroy _pickle # destroy systemd._daemon # destroy _socket # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy 
fcntl # destroy _blake2 # destroy _lzma # destroy zlib # destroy _signal <<< 41684 1727204445.96176: stdout chunk (state=3): >>># destroy platform # destroy _uuid <<< 41684 1727204445.96217: stdout chunk (state=3): >>># destroy _sre # destroy sre_parse # destroy tokenize # destroy _heapq # destroy posixpath <<< 41684 1727204445.96232: stdout chunk (state=3): >>># destroy stat # destroy ansible.module_utils.six.moves.urllib # destroy errno # destroy signal # destroy contextlib # destroy pwd # destroy grp # destroy _posixsubprocess # destroy selectors # destroy select <<< 41684 1727204445.96258: stdout chunk (state=3): >>># destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy itertools # destroy operator # destroy ansible.module_utils.six.moves # destroy _operator # destroy _frozen_importlib_external # destroy _imp # destroy io # destroy marshal <<< 41684 1727204445.96295: stdout chunk (state=3): >>># destroy _frozen_importlib # clear sys.audit hooks <<< 41684 1727204445.96632: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.148 closed. 
<<< 41684 1727204445.96637: stderr chunk (state=3): >>><<< 41684 1727204445.96640: stdout chunk (state=3): >>><<< 41684 1727204445.96786: _low_level_execute_command() done: rc=0, stdout=import _frozen_importlib # frozen import _imp # builtin import '_thread' # import '_warnings' # import '_weakref' # import '_io' # import 'marshal' # import 'posix' # import '_frozen_importlib_external' # # installing zipimport hook import 'time' # import 'zipimport' # # installed zipimport hook # /usr/lib64/python3.9/encodings/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/encodings/__init__.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/codecs.cpython-39.pyc matches /usr/lib64/python3.9/codecs.py # code object from '/usr/lib64/python3.9/__pycache__/codecs.cpython-39.pyc' import '_codecs' # import 'codecs' # <_frozen_importlib_external.SourceFileLoader object at 0x7f27c151edc0> # /usr/lib64/python3.9/encodings/__pycache__/aliases.cpython-39.pyc matches /usr/lib64/python3.9/encodings/aliases.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/aliases.cpython-39.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7f27c14c33a0> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f27c151eb20> # /usr/lib64/python3.9/encodings/__pycache__/utf_8.cpython-39.pyc matches /usr/lib64/python3.9/encodings/utf_8.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/utf_8.cpython-39.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7f27c151eac0> import '_signal' # # /usr/lib64/python3.9/encodings/__pycache__/latin_1.cpython-39.pyc matches /usr/lib64/python3.9/encodings/latin_1.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/latin_1.cpython-39.pyc' import 'encodings.latin_1' # <_frozen_importlib_external.SourceFileLoader object at 0x7f27c14c3490> # 
/usr/lib64/python3.9/__pycache__/io.cpython-39.pyc matches /usr/lib64/python3.9/io.py # code object from '/usr/lib64/python3.9/__pycache__/io.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/abc.py # code object from '/usr/lib64/python3.9/__pycache__/abc.cpython-39.pyc' import '_abc' # import 'abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f27c14c3940> import 'io' # <_frozen_importlib_external.SourceFileLoader object at 0x7f27c14c3670> # /usr/lib64/python3.9/__pycache__/site.cpython-39.pyc matches /usr/lib64/python3.9/site.py # code object from '/usr/lib64/python3.9/__pycache__/site.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/os.cpython-39.pyc matches /usr/lib64/python3.9/os.py # code object from '/usr/lib64/python3.9/__pycache__/os.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/stat.cpython-39.pyc matches /usr/lib64/python3.9/stat.py # code object from '/usr/lib64/python3.9/__pycache__/stat.cpython-39.pyc' import '_stat' # import 'stat' # <_frozen_importlib_external.SourceFileLoader object at 0x7f27c147a190> # /usr/lib64/python3.9/__pycache__/_collections_abc.cpython-39.pyc matches /usr/lib64/python3.9/_collections_abc.py # code object from '/usr/lib64/python3.9/__pycache__/_collections_abc.cpython-39.pyc' import '_collections_abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f27c147a220> # /usr/lib64/python3.9/__pycache__/posixpath.cpython-39.pyc matches /usr/lib64/python3.9/posixpath.py # code object from '/usr/lib64/python3.9/__pycache__/posixpath.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/genericpath.cpython-39.pyc matches /usr/lib64/python3.9/genericpath.py # code object from '/usr/lib64/python3.9/__pycache__/genericpath.cpython-39.pyc' import 'genericpath' # <_frozen_importlib_external.SourceFileLoader object at 0x7f27c149d850> import 'posixpath' # <_frozen_importlib_external.SourceFileLoader object at 0x7f27c147a940> import 'os' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f27c14db880> # /usr/lib64/python3.9/__pycache__/_sitebuiltins.cpython-39.pyc matches /usr/lib64/python3.9/_sitebuiltins.py # code object from '/usr/lib64/python3.9/__pycache__/_sitebuiltins.cpython-39.pyc' import '_sitebuiltins' # <_frozen_importlib_external.SourceFileLoader object at 0x7f27c1473d90> # /usr/lib64/python3.9/__pycache__/_bootlocale.cpython-39.pyc matches /usr/lib64/python3.9/_bootlocale.py # code object from '/usr/lib64/python3.9/__pycache__/_bootlocale.cpython-39.pyc' import '_locale' # import '_bootlocale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f27c149dd90> import 'site' # <_frozen_importlib_external.SourceFileLoader object at 0x7f27c14c3970> Python 3.9.19 (main, Aug 23 2024, 00:00:00) [GCC 11.5.0 20240719 (Red Hat 11.5.0-2)] on linux Type "help", "copyright", "credits" or "license" for more information. # /usr/lib64/python3.9/__pycache__/base64.cpython-39.pyc matches /usr/lib64/python3.9/base64.py # code object from '/usr/lib64/python3.9/__pycache__/base64.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/re.cpython-39.pyc matches /usr/lib64/python3.9/re.py # code object from '/usr/lib64/python3.9/__pycache__/re.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/enum.cpython-39.pyc matches /usr/lib64/python3.9/enum.py # code object from '/usr/lib64/python3.9/__pycache__/enum.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/types.cpython-39.pyc matches /usr/lib64/python3.9/types.py # code object from '/usr/lib64/python3.9/__pycache__/types.cpython-39.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7f27c11d4f10> import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7f27c11d90a0> # /usr/lib64/python3.9/__pycache__/sre_compile.cpython-39.pyc matches /usr/lib64/python3.9/sre_compile.py # code object from '/usr/lib64/python3.9/__pycache__/sre_compile.cpython-39.pyc' import '_sre' # # 
/usr/lib64/python3.9/__pycache__/sre_parse.cpython-39.pyc matches /usr/lib64/python3.9/sre_parse.py # code object from '/usr/lib64/python3.9/__pycache__/sre_parse.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/sre_constants.cpython-39.pyc matches /usr/lib64/python3.9/sre_constants.py # code object from '/usr/lib64/python3.9/__pycache__/sre_constants.cpython-39.pyc' import 'sre_constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7f27c11cc5b0> import 'sre_parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f27c11d36a0> import 'sre_compile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f27c11d43d0> # /usr/lib64/python3.9/__pycache__/functools.cpython-39.pyc matches /usr/lib64/python3.9/functools.py # code object from '/usr/lib64/python3.9/__pycache__/functools.cpython-39.pyc' # /usr/lib64/python3.9/collections/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/collections/__init__.py # code object from '/usr/lib64/python3.9/collections/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/heapq.cpython-39.pyc matches /usr/lib64/python3.9/heapq.py # code object from '/usr/lib64/python3.9/__pycache__/heapq.cpython-39.pyc' # extension module '_heapq' loaded from '/usr/lib64/python3.9/lib-dynload/_heapq.cpython-39-x86_64-linux-gnu.so' # extension module '_heapq' executed from '/usr/lib64/python3.9/lib-dynload/_heapq.cpython-39-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f27c1090e80> import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7f27c1090970> import 'itertools' # # /usr/lib64/python3.9/__pycache__/keyword.cpython-39.pyc matches /usr/lib64/python3.9/keyword.py # code object from '/usr/lib64/python3.9/__pycache__/keyword.cpython-39.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7f27c1090f70> # /usr/lib64/python3.9/__pycache__/operator.cpython-39.pyc matches 
/usr/lib64/python3.9/operator.py # code object from '/usr/lib64/python3.9/__pycache__/operator.cpython-39.pyc' import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7f27c1090dc0> # /usr/lib64/python3.9/__pycache__/reprlib.cpython-39.pyc matches /usr/lib64/python3.9/reprlib.py # code object from '/usr/lib64/python3.9/__pycache__/reprlib.cpython-39.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f27c10a0130> import '_collections' # import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7f27c11aedf0> import '_functools' # import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7f27c11a76d0> # /usr/lib64/python3.9/__pycache__/copyreg.cpython-39.pyc matches /usr/lib64/python3.9/copyreg.py # code object from '/usr/lib64/python3.9/__pycache__/copyreg.cpython-39.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7f27c11ba730> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7f27c11dae80> # /usr/lib64/python3.9/__pycache__/struct.cpython-39.pyc matches /usr/lib64/python3.9/struct.py # code object from '/usr/lib64/python3.9/__pycache__/struct.cpython-39.pyc' # extension module '_struct' loaded from '/usr/lib64/python3.9/lib-dynload/_struct.cpython-39-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.9/lib-dynload/_struct.cpython-39-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f27c10a0d30> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7f27c11ae310> # extension module 'binascii' loaded from '/usr/lib64/python3.9/lib-dynload/binascii.cpython-39-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.9/lib-dynload/binascii.cpython-39-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f27c11ba340> import 
'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7f27c11e0a30> # /usr/lib64/python3.9/__pycache__/runpy.cpython-39.pyc matches /usr/lib64/python3.9/runpy.py # code object from '/usr/lib64/python3.9/__pycache__/runpy.cpython-39.pyc' # /usr/lib64/python3.9/importlib/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/importlib/__init__.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/warnings.cpython-39.pyc matches /usr/lib64/python3.9/warnings.py # code object from '/usr/lib64/python3.9/__pycache__/warnings.cpython-39.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f27c10a0f10> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f27c10a0e50> # /usr/lib64/python3.9/importlib/__pycache__/machinery.cpython-39.pyc matches /usr/lib64/python3.9/importlib/machinery.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/machinery.cpython-39.pyc' import 'importlib.machinery' # <_frozen_importlib_external.SourceFileLoader object at 0x7f27c10a0dc0> # /usr/lib64/python3.9/importlib/__pycache__/util.cpython-39.pyc matches /usr/lib64/python3.9/importlib/util.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/util.cpython-39.pyc' # /usr/lib64/python3.9/importlib/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/importlib/abc.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/abc.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/typing.cpython-39.pyc matches /usr/lib64/python3.9/typing.py # code object from '/usr/lib64/python3.9/__pycache__/typing.cpython-39.pyc' # /usr/lib64/python3.9/collections/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/collections/abc.py # code object from '/usr/lib64/python3.9/collections/__pycache__/abc.cpython-39.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f27c1074430> # 
/usr/lib64/python3.9/__pycache__/contextlib.cpython-39.pyc matches /usr/lib64/python3.9/contextlib.py # code object from '/usr/lib64/python3.9/__pycache__/contextlib.cpython-39.pyc' import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f27c1074520> import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f27c10a8fa0> import 'importlib.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f27c10a3af0> import 'importlib.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7f27c10a34c0> # /usr/lib64/python3.9/__pycache__/pkgutil.cpython-39.pyc matches /usr/lib64/python3.9/pkgutil.py # code object from '/usr/lib64/python3.9/__pycache__/pkgutil.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/weakref.cpython-39.pyc matches /usr/lib64/python3.9/weakref.py # code object from '/usr/lib64/python3.9/__pycache__/weakref.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/_weakrefset.cpython-39.pyc matches /usr/lib64/python3.9/_weakrefset.py # code object from '/usr/lib64/python3.9/__pycache__/_weakrefset.cpython-39.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7f27c0fc2280> import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7f27c105fdc0> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f27c10a3f70> import 'runpy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f27c11e00a0> # /usr/lib64/python3.9/__pycache__/shutil.cpython-39.pyc matches /usr/lib64/python3.9/shutil.py # code object from '/usr/lib64/python3.9/__pycache__/shutil.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/fnmatch.cpython-39.pyc matches /usr/lib64/python3.9/fnmatch.py # code object from '/usr/lib64/python3.9/__pycache__/fnmatch.cpython-39.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7f27c0fd3bb0> import 'errno' # # extension module 'zlib' loaded from 
'/usr/lib64/python3.9/lib-dynload/zlib.cpython-39-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.9/lib-dynload/zlib.cpython-39-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f27c0fd3ee0> # /usr/lib64/python3.9/__pycache__/bz2.cpython-39.pyc matches /usr/lib64/python3.9/bz2.py # code object from '/usr/lib64/python3.9/__pycache__/bz2.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/_compression.cpython-39.pyc matches /usr/lib64/python3.9/_compression.py # code object from '/usr/lib64/python3.9/__pycache__/_compression.cpython-39.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7f27c0fe57f0> # /usr/lib64/python3.9/__pycache__/threading.cpython-39.pyc matches /usr/lib64/python3.9/threading.py # code object from '/usr/lib64/python3.9/__pycache__/threading.cpython-39.pyc' import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7f27c0fe5d30> # extension module '_bz2' loaded from '/usr/lib64/python3.9/lib-dynload/_bz2.cpython-39-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.9/lib-dynload/_bz2.cpython-39-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f27c0f7e460> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7f27c0fd3fd0> # /usr/lib64/python3.9/__pycache__/lzma.cpython-39.pyc matches /usr/lib64/python3.9/lzma.py # code object from '/usr/lib64/python3.9/__pycache__/lzma.cpython-39.pyc' # extension module '_lzma' loaded from '/usr/lib64/python3.9/lib-dynload/_lzma.cpython-39-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.9/lib-dynload/_lzma.cpython-39-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f27c0f8e340> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7f27c0fe5670> import 'pwd' # # extension module 'grp' 
loaded from '/usr/lib64/python3.9/lib-dynload/grp.cpython-39-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.9/lib-dynload/grp.cpython-39-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f27c0f8e400> import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f27c10a0a90> # /usr/lib64/python3.9/__pycache__/tempfile.cpython-39.pyc matches /usr/lib64/python3.9/tempfile.py # code object from '/usr/lib64/python3.9/__pycache__/tempfile.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/random.cpython-39.pyc matches /usr/lib64/python3.9/random.py # code object from '/usr/lib64/python3.9/__pycache__/random.cpython-39.pyc' # extension module 'math' loaded from '/usr/lib64/python3.9/lib-dynload/math.cpython-39-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.9/lib-dynload/math.cpython-39-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f27c0faa760> # /usr/lib64/python3.9/__pycache__/bisect.cpython-39.pyc matches /usr/lib64/python3.9/bisect.py # code object from '/usr/lib64/python3.9/__pycache__/bisect.cpython-39.pyc' # extension module '_bisect' loaded from '/usr/lib64/python3.9/lib-dynload/_bisect.cpython-39-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.9/lib-dynload/_bisect.cpython-39-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f27c0faaa30> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7f27c0faa820> # extension module '_random' loaded from '/usr/lib64/python3.9/lib-dynload/_random.cpython-39-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.9/lib-dynload/_random.cpython-39-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f27c0faa910> # /usr/lib64/python3.9/__pycache__/hashlib.cpython-39.pyc 
matches /usr/lib64/python3.9/hashlib.py # code object from '/usr/lib64/python3.9/__pycache__/hashlib.cpython-39.pyc' # extension module '_hashlib' loaded from '/usr/lib64/python3.9/lib-dynload/_hashlib.cpython-39-x86_64-linux-gnu.so' # extension module '_hashlib' executed from '/usr/lib64/python3.9/lib-dynload/_hashlib.cpython-39-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f27c0faad60> # extension module '_blake2' loaded from '/usr/lib64/python3.9/lib-dynload/_blake2.cpython-39-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.9/lib-dynload/_blake2.cpython-39-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f27c0fb42b0> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f27c0faa9a0> import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7f27c0f9eaf0> import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f27c10a0670> # /usr/lib64/python3.9/__pycache__/zipfile.cpython-39.pyc matches /usr/lib64/python3.9/zipfile.py # code object from '/usr/lib64/python3.9/__pycache__/zipfile.cpython-39.pyc' import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f27c0faab50> # code object from '/usr/lib64/python3.9/encodings/cp437.pyc' import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7f27c09e6730> # zipimport: found 103 names in '/tmp/ansible_setup_payload_jtlx47sb/ansible_setup_payload.zip' # zipimport: zlib available # zipimport: zlib available import ansible # loaded from Zip /tmp/ansible_setup_payload_jtlx47sb/ansible_setup_payload.zip/ansible/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils # loaded from Zip /tmp/ansible_setup_payload_jtlx47sb/ansible_setup_payload.zip/ansible/module_utils/__init__.py # zipimport: zlib available # zipimport: zlib available # 
/usr/lib64/python3.9/__pycache__/__future__.cpython-39.pyc matches /usr/lib64/python3.9/__future__.py # code object from '/usr/lib64/python3.9/__pycache__/__future__.cpython-39.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7f27c0923880> # /usr/lib64/python3.9/json/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/json/__init__.py # code object from '/usr/lib64/python3.9/json/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/json/__pycache__/decoder.cpython-39.pyc matches /usr/lib64/python3.9/json/decoder.py # code object from '/usr/lib64/python3.9/json/__pycache__/decoder.cpython-39.pyc' # /usr/lib64/python3.9/json/__pycache__/scanner.cpython-39.pyc matches /usr/lib64/python3.9/json/scanner.py # code object from '/usr/lib64/python3.9/json/__pycache__/scanner.cpython-39.pyc' # extension module '_json' loaded from '/usr/lib64/python3.9/lib-dynload/_json.cpython-39-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.9/lib-dynload/_json.cpython-39-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f27c0923160> import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7f27c0923280> import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f27c0923fd0> # /usr/lib64/python3.9/json/__pycache__/encoder.cpython-39.pyc matches /usr/lib64/python3.9/json/encoder.py # code object from '/usr/lib64/python3.9/json/__pycache__/encoder.cpython-39.pyc' import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f27c09234f0> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7f27c0923df0> import 'atexit' # # extension module 'fcntl' loaded from '/usr/lib64/python3.9/lib-dynload/fcntl.cpython-39-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.9/lib-dynload/fcntl.cpython-39-x86_64-linux-gnu.so' import 'fcntl' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7f27c0923580> # /usr/lib64/python3.9/__pycache__/locale.cpython-39.pyc matches /usr/lib64/python3.9/locale.py # code object from '/usr/lib64/python3.9/__pycache__/locale.cpython-39.pyc' import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f27c0923100> # /usr/lib64/python3.9/__pycache__/platform.cpython-39.pyc matches /usr/lib64/python3.9/platform.py # code object from '/usr/lib64/python3.9/__pycache__/platform.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/subprocess.cpython-39.pyc matches /usr/lib64/python3.9/subprocess.py # code object from '/usr/lib64/python3.9/__pycache__/subprocess.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/signal.cpython-39.pyc matches /usr/lib64/python3.9/signal.py # code object from '/usr/lib64/python3.9/__pycache__/signal.cpython-39.pyc' import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f27c08b8070> # extension module '_posixsubprocess' loaded from '/usr/lib64/python3.9/lib-dynload/_posixsubprocess.cpython-39-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.9/lib-dynload/_posixsubprocess.cpython-39-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f27c08003a0> # extension module 'select' loaded from '/usr/lib64/python3.9/lib-dynload/select.cpython-39-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.9/lib-dynload/select.cpython-39-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f27c08000a0> # /usr/lib64/python3.9/__pycache__/selectors.cpython-39.pyc matches /usr/lib64/python3.9/selectors.py # code object from '/usr/lib64/python3.9/__pycache__/selectors.cpython-39.pyc' import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7f27c0800d00> import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 
0x7f27c090bdc0> import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7f27c090b3a0> # /usr/lib64/python3.9/__pycache__/shlex.cpython-39.pyc matches /usr/lib64/python3.9/shlex.py # code object from '/usr/lib64/python3.9/__pycache__/shlex.cpython-39.pyc' import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7f27c090bf40> # /usr/lib64/python3.9/__pycache__/traceback.cpython-39.pyc matches /usr/lib64/python3.9/traceback.py # code object from '/usr/lib64/python3.9/__pycache__/traceback.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/linecache.cpython-39.pyc matches /usr/lib64/python3.9/linecache.py # code object from '/usr/lib64/python3.9/__pycache__/linecache.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/tokenize.cpython-39.pyc matches /usr/lib64/python3.9/tokenize.py # code object from '/usr/lib64/python3.9/__pycache__/tokenize.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/token.cpython-39.pyc matches /usr/lib64/python3.9/token.py # code object from '/usr/lib64/python3.9/__pycache__/token.cpython-39.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7f27c095ae80> import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f27c08e1d90> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7f27c08e1460> import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7f27c0920ac0> # extension module 'syslog' loaded from '/usr/lib64/python3.9/lib-dynload/syslog.cpython-39-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.9/lib-dynload/syslog.cpython-39-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f27c08e1580> # /usr/lib64/python3.9/site-packages/systemd/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/__init__.cpython-39.pyc' 
import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7f27c08e15b0> # /usr/lib64/python3.9/site-packages/systemd/__pycache__/journal.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/journal.py # code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/journal.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/datetime.cpython-39.pyc matches /usr/lib64/python3.9/datetime.py # code object from '/usr/lib64/python3.9/__pycache__/datetime.cpython-39.pyc' # extension module '_datetime' loaded from '/usr/lib64/python3.9/lib-dynload/_datetime.cpython-39-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.9/lib-dynload/_datetime.cpython-39-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f27c086bf70> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7f27c096c2b0> # /usr/lib64/python3.9/__pycache__/uuid.cpython-39.pyc matches /usr/lib64/python3.9/uuid.py # code object from '/usr/lib64/python3.9/__pycache__/uuid.cpython-39.pyc' # extension module '_uuid' loaded from '/usr/lib64/python3.9/lib-dynload/_uuid.cpython-39-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.9/lib-dynload/_uuid.cpython-39-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f27c08687f0> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7f27c096c430> # /usr/lib64/python3.9/logging/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/logging/__init__.py # code object from '/usr/lib64/python3.9/logging/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/string.cpython-39.pyc matches /usr/lib64/python3.9/string.py # code object from '/usr/lib64/python3.9/__pycache__/string.cpython-39.pyc' import '_string' # import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7f27c096cc40> import 
'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7f27c0868790> # extension module 'systemd._journal' loaded from '/usr/lib64/python3.9/site-packages/systemd/_journal.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.9/site-packages/systemd/_journal.cpython-39-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f27c096c100> # extension module 'systemd._reader' loaded from '/usr/lib64/python3.9/site-packages/systemd/_reader.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.9/site-packages/systemd/_reader.cpython-39-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f27c096c5b0> # extension module 'systemd.id128' loaded from '/usr/lib64/python3.9/site-packages/systemd/id128.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.9/site-packages/systemd/id128.cpython-39-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f27c096cf70> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f27c0965970> # /usr/lib64/python3.9/site-packages/systemd/__pycache__/daemon.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/daemon.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/socket.cpython-39.pyc matches /usr/lib64/python3.9/socket.py # code object from '/usr/lib64/python3.9/__pycache__/socket.cpython-39.pyc' # extension module '_socket' loaded from '/usr/lib64/python3.9/lib-dynload/_socket.cpython-39-x86_64-linux-gnu.so' # extension module '_socket' executed from '/usr/lib64/python3.9/lib-dynload/_socket.cpython-39-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object 
at 0x7f27c085e8e0> # extension module 'array' loaded from '/usr/lib64/python3.9/lib-dynload/array.cpython-39-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.9/lib-dynload/array.cpython-39-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f27c087cdf0> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7f27c0867520> # extension module 'systemd._daemon' loaded from '/usr/lib64/python3.9/site-packages/systemd/_daemon.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.9/site-packages/systemd/_daemon.cpython-39-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f27c085ee80> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7f27c0867940> # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.compat # loaded from Zip /tmp/ansible_setup_payload_jtlx47sb/ansible_setup_payload.zip/ansible/module_utils/compat/__init__.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common # loaded from Zip /tmp/ansible_setup_payload_jtlx47sb/ansible_setup_payload.zip/ansible/module_utils/common/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common.text # loaded from Zip /tmp/ansible_setup_payload_jtlx47sb/ansible_setup_payload.zip/ansible/module_utils/common/text/__init__.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.six # loaded from Zip /tmp/ansible_setup_payload_jtlx47sb/ansible_setup_payload.zip/ansible/module_utils/six/__init__.py import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import ansible.module_utils.common.text.converters 
# loaded from Zip /tmp/ansible_setup_payload_jtlx47sb/ansible_setup_payload.zip/ansible/module_utils/common/text/converters.py # /usr/lib64/python3.9/ctypes/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/ctypes/__init__.py # code object from '/usr/lib64/python3.9/ctypes/__pycache__/__init__.cpython-39.pyc' # extension module '_ctypes' loaded from '/usr/lib64/python3.9/lib-dynload/_ctypes.cpython-39-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.9/lib-dynload/_ctypes.cpython-39-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f27c0877790> # /usr/lib64/python3.9/ctypes/__pycache__/_endian.cpython-39.pyc matches /usr/lib64/python3.9/ctypes/_endian.py # code object from '/usr/lib64/python3.9/ctypes/__pycache__/_endian.cpython-39.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7f27c08b6850> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7f27c0417fd0> import ansible.module_utils.compat.selinux # loaded from Zip /tmp/ansible_setup_payload_jtlx47sb/ansible_setup_payload.zip/ansible/module_utils/compat/selinux.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils._text # loaded from Zip /tmp/ansible_setup_payload_jtlx47sb/ansible_setup_payload.zip/ansible/module_utils/_text.py # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/__pycache__/copy.cpython-39.pyc matches /usr/lib64/python3.9/copy.py # code object from '/usr/lib64/python3.9/__pycache__/copy.cpython-39.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f27c08e92e0> # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common.collections # loaded from Zip /tmp/ansible_setup_payload_jtlx47sb/ansible_setup_payload.zip/ansible/module_utils/common/collections.py # zipimport: 
zlib available # zipimport: zlib available import ansible.module_utils.common.warnings # loaded from Zip /tmp/ansible_setup_payload_jtlx47sb/ansible_setup_payload.zip/ansible/module_utils/common/warnings.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.errors # loaded from Zip /tmp/ansible_setup_payload_jtlx47sb/ansible_setup_payload.zip/ansible/module_utils/errors.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.parsing # loaded from Zip /tmp/ansible_setup_payload_jtlx47sb/ansible_setup_payload.zip/ansible/module_utils/parsing/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.parsing.convert_bool # loaded from Zip /tmp/ansible_setup_payload_jtlx47sb/ansible_setup_payload.zip/ansible/module_utils/parsing/convert_bool.py # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/__pycache__/ast.cpython-39.pyc matches /usr/lib64/python3.9/ast.py # code object from '/usr/lib64/python3.9/__pycache__/ast.cpython-39.pyc' import '_ast' # import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7f27c0929ca0> # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common.text.formatters # loaded from Zip /tmp/ansible_setup_payload_jtlx47sb/ansible_setup_payload.zip/ansible/module_utils/common/text/formatters.py import ansible.module_utils.common.validation # loaded from Zip /tmp/ansible_setup_payload_jtlx47sb/ansible_setup_payload.zip/ansible/module_utils/common/validation.py import ansible.module_utils.common.parameters # loaded from Zip /tmp/ansible_setup_payload_jtlx47sb/ansible_setup_payload.zip/ansible/module_utils/common/parameters.py import ansible.module_utils.common.arg_spec # loaded from Zip /tmp/ansible_setup_payload_jtlx47sb/ansible_setup_payload.zip/ansible/module_utils/common/arg_spec.py # zipimport: zlib available # zipimport: zlib available import 
ansible.module_utils.common.locale # loaded from Zip /tmp/ansible_setup_payload_jtlx47sb/ansible_setup_payload.zip/ansible/module_utils/common/locale.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/site-packages/selinux/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/selinux/__init__.py # code object from '/usr/lib64/python3.9/site-packages/selinux/__pycache__/__init__.cpython-39.pyc' # extension module 'selinux._selinux' loaded from '/usr/lib64/python3.9/site-packages/selinux/_selinux.cpython-39-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.9/site-packages/selinux/_selinux.cpython-39-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f27c089bc40> import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7f27c0929be0> import ansible.module_utils.common.file # loaded from Zip /tmp/ansible_setup_payload_jtlx47sb/ansible_setup_payload.zip/ansible/module_utils/common/file.py import ansible.module_utils.common.process # loaded from Zip /tmp/ansible_setup_payload_jtlx47sb/ansible_setup_payload.zip/ansible/module_utils/common/process.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib/python3.9/site-packages/__pycache__/distro.cpython-39.pyc matches /usr/lib/python3.9/site-packages/distro.py # code object from '/usr/lib/python3.9/site-packages/__pycache__/distro.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/argparse.cpython-39.pyc matches /usr/lib64/python3.9/argparse.py # code object from '/usr/lib64/python3.9/__pycache__/argparse.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/gettext.cpython-39.pyc matches /usr/lib64/python3.9/gettext.py # code object from '/usr/lib64/python3.9/__pycache__/gettext.cpython-39.pyc' import 'gettext' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f27c08ad910> import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f27c08f7b50> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f27c0289820> # destroy ansible.module_utils.distro import ansible.module_utils.distro # loaded from Zip /tmp/ansible_setup_payload_jtlx47sb/ansible_setup_payload.zip/ansible/module_utils/distro/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common._utils # loaded from Zip /tmp/ansible_setup_payload_jtlx47sb/ansible_setup_payload.zip/ansible/module_utils/common/_utils.py import ansible.module_utils.common.sys_info # loaded from Zip /tmp/ansible_setup_payload_jtlx47sb/ansible_setup_payload.zip/ansible/module_utils/common/sys_info.py import ansible.module_utils.basic # loaded from Zip /tmp/ansible_setup_payload_jtlx47sb/ansible_setup_payload.zip/ansible/module_utils/basic.py # zipimport: zlib available # zipimport: zlib available import ansible.modules # loaded from Zip /tmp/ansible_setup_payload_jtlx47sb/ansible_setup_payload.zip/ansible/modules/__init__.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.namespace # loaded from Zip /tmp/ansible_setup_payload_jtlx47sb/ansible_setup_payload.zip/ansible/module_utils/facts/namespace.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.compat.typing # loaded from Zip /tmp/ansible_setup_payload_jtlx47sb/ansible_setup_payload.zip/ansible/module_utils/compat/typing.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # 
/usr/lib64/python3.9/multiprocessing/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/__init__.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/multiprocessing/__pycache__/context.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/context.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/context.cpython-39.pyc' # /usr/lib64/python3.9/multiprocessing/__pycache__/process.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/process.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/process.cpython-39.pyc' import 'multiprocessing.process' # <_frozen_importlib_external.SourceFileLoader object at 0x7f27c0174100> # /usr/lib64/python3.9/multiprocessing/__pycache__/reduction.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/reduction.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/reduction.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/pickle.cpython-39.pyc matches /usr/lib64/python3.9/pickle.py # code object from '/usr/lib64/python3.9/__pycache__/pickle.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/_compat_pickle.cpython-39.pyc matches /usr/lib64/python3.9/_compat_pickle.py # code object from '/usr/lib64/python3.9/__pycache__/_compat_pickle.cpython-39.pyc' import '_compat_pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f27c03d8a90> # extension module '_pickle' loaded from '/usr/lib64/python3.9/lib-dynload/_pickle.cpython-39-x86_64-linux-gnu.so' # extension module '_pickle' executed from '/usr/lib64/python3.9/lib-dynload/_pickle.cpython-39-x86_64-linux-gnu.so' import '_pickle' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f27c03d8a00> import 'pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f27c03abdc0> import 'multiprocessing.reduction' # <_frozen_importlib_external.SourceFileLoader object at 0x7f27c03ab790> import 
'multiprocessing.context' # <_frozen_importlib_external.SourceFileLoader object at 0x7f27c03f74c0> import 'multiprocessing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f27c03f7d60> # /usr/lib64/python3.9/multiprocessing/__pycache__/pool.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/pool.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/pool.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/queue.cpython-39.pyc matches /usr/lib64/python3.9/queue.py # code object from '/usr/lib64/python3.9/__pycache__/queue.cpython-39.pyc' # extension module '_queue' loaded from '/usr/lib64/python3.9/lib-dynload/_queue.cpython-39-x86_64-linux-gnu.so' # extension module '_queue' executed from '/usr/lib64/python3.9/lib-dynload/_queue.cpython-39-x86_64-linux-gnu.so' import '_queue' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f27c03bbee0> import 'queue' # <_frozen_importlib_external.SourceFileLoader object at 0x7f27c03bb9d0> # /usr/lib64/python3.9/multiprocessing/__pycache__/util.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/util.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/util.cpython-39.pyc' import 'multiprocessing.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7f27c03bb1f0> # /usr/lib64/python3.9/multiprocessing/__pycache__/connection.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/connection.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/connection.cpython-39.pyc' # extension module '_multiprocessing' loaded from '/usr/lib64/python3.9/lib-dynload/_multiprocessing.cpython-39-x86_64-linux-gnu.so' # extension module '_multiprocessing' executed from '/usr/lib64/python3.9/lib-dynload/_multiprocessing.cpython-39-x86_64-linux-gnu.so' import '_multiprocessing' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f27c01d6280> import 'multiprocessing.connection' # <_frozen_importlib_external.SourceFileLoader object at 
0x7f27c0975a30> import 'multiprocessing.pool' # <_frozen_importlib_external.SourceFileLoader object at 0x7f27c03f7070> import ansible.module_utils.facts.timeout # loaded from Zip /tmp/ansible_setup_payload_jtlx47sb/ansible_setup_payload.zip/ansible/module_utils/facts/timeout.py import ansible.module_utils.facts.collector # loaded from Zip /tmp/ansible_setup_payload_jtlx47sb/ansible_setup_payload.zip/ansible/module_utils/facts/collector.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.other # loaded from Zip /tmp/ansible_setup_payload_jtlx47sb/ansible_setup_payload.zip/ansible/module_utils/facts/other/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.other.facter # loaded from Zip /tmp/ansible_setup_payload_jtlx47sb/ansible_setup_payload.zip/ansible/module_utils/facts/other/facter.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.other.ohai # loaded from Zip /tmp/ansible_setup_payload_jtlx47sb/ansible_setup_payload.zip/ansible/module_utils/facts/other/ohai.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system # loaded from Zip /tmp/ansible_setup_payload_jtlx47sb/ansible_setup_payload.zip/ansible/module_utils/facts/system/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.apparmor # loaded from Zip /tmp/ansible_setup_payload_jtlx47sb/ansible_setup_payload.zip/ansible/module_utils/facts/system/apparmor.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.caps # loaded from Zip /tmp/ansible_setup_payload_jtlx47sb/ansible_setup_payload.zip/ansible/module_utils/facts/system/caps.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.chroot # loaded from Zip 
/tmp/ansible_setup_payload_jtlx47sb/ansible_setup_payload.zip/ansible/module_utils/facts/system/chroot.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.utils # loaded from Zip /tmp/ansible_setup_payload_jtlx47sb/ansible_setup_payload.zip/ansible/module_utils/facts/utils.py import ansible.module_utils.facts.system.cmdline # loaded from Zip /tmp/ansible_setup_payload_jtlx47sb/ansible_setup_payload.zip/ansible/module_utils/facts/system/cmdline.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.distribution # loaded from Zip /tmp/ansible_setup_payload_jtlx47sb/ansible_setup_payload.zip/ansible/module_utils/facts/system/distribution.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.compat.datetime # loaded from Zip /tmp/ansible_setup_payload_jtlx47sb/ansible_setup_payload.zip/ansible/module_utils/compat/datetime.py import ansible.module_utils.facts.system.date_time # loaded from Zip /tmp/ansible_setup_payload_jtlx47sb/ansible_setup_payload.zip/ansible/module_utils/facts/system/date_time.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.env # loaded from Zip /tmp/ansible_setup_payload_jtlx47sb/ansible_setup_payload.zip/ansible/module_utils/facts/system/env.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.dns # loaded from Zip /tmp/ansible_setup_payload_jtlx47sb/ansible_setup_payload.zip/ansible/module_utils/facts/system/dns.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.fips # loaded from Zip /tmp/ansible_setup_payload_jtlx47sb/ansible_setup_payload.zip/ansible/module_utils/facts/system/fips.py # zipimport: zlib available # zipimport: zlib available import 
ansible.module_utils.facts.system.loadavg # loaded from Zip /tmp/ansible_setup_payload_jtlx47sb/ansible_setup_payload.zip/ansible/module_utils/facts/system/loadavg.py # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/__pycache__/glob.cpython-39.pyc matches /usr/lib64/python3.9/glob.py # code object from '/usr/lib64/python3.9/__pycache__/glob.cpython-39.pyc' import 'glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f27c00c7ee0> # /usr/lib64/python3.9/__pycache__/configparser.cpython-39.pyc matches /usr/lib64/python3.9/configparser.py # code object from '/usr/lib64/python3.9/__pycache__/configparser.cpython-39.pyc' import 'configparser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f27c00c79d0> import ansible.module_utils.facts.system.local # loaded from Zip /tmp/ansible_setup_payload_jtlx47sb/ansible_setup_payload.zip/ansible/module_utils/facts/system/local.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.lsb # loaded from Zip /tmp/ansible_setup_payload_jtlx47sb/ansible_setup_payload.zip/ansible/module_utils/facts/system/lsb.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.pkg_mgr # loaded from Zip /tmp/ansible_setup_payload_jtlx47sb/ansible_setup_payload.zip/ansible/module_utils/facts/system/pkg_mgr.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.platform # loaded from Zip /tmp/ansible_setup_payload_jtlx47sb/ansible_setup_payload.zip/ansible/module_utils/facts/system/platform.py # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/__pycache__/ssl.cpython-39.pyc matches /usr/lib64/python3.9/ssl.py # code object from '/usr/lib64/python3.9/__pycache__/ssl.cpython-39.pyc' # extension module '_ssl' loaded from '/usr/lib64/python3.9/lib-dynload/_ssl.cpython-39-x86_64-linux-gnu.so' # extension module '_ssl' executed from 
'/usr/lib64/python3.9/lib-dynload/_ssl.cpython-39-x86_64-linux-gnu.so' import '_ssl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f27c00ef040> import 'ssl' # <_frozen_importlib_external.SourceFileLoader object at 0x7f27c03f78e0> import ansible.module_utils.facts.system.python # loaded from Zip /tmp/ansible_setup_payload_jtlx47sb/ansible_setup_payload.zip/ansible/module_utils/facts/system/python.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.selinux # loaded from Zip /tmp/ansible_setup_payload_jtlx47sb/ansible_setup_payload.zip/ansible/module_utils/facts/system/selinux.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.compat.version # loaded from Zip /tmp/ansible_setup_payload_jtlx47sb/ansible_setup_payload.zip/ansible/module_utils/compat/version.py import ansible.module_utils.facts.system.service_mgr # loaded from Zip /tmp/ansible_setup_payload_jtlx47sb/ansible_setup_payload.zip/ansible/module_utils/facts/system/service_mgr.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.ssh_pub_keys # loaded from Zip /tmp/ansible_setup_payload_jtlx47sb/ansible_setup_payload.zip/ansible/module_utils/facts/system/ssh_pub_keys.py # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/__pycache__/getpass.cpython-39.pyc matches /usr/lib64/python3.9/getpass.py # code object from '/usr/lib64/python3.9/__pycache__/getpass.cpython-39.pyc' # extension module 'termios' loaded from '/usr/lib64/python3.9/lib-dynload/termios.cpython-39-x86_64-linux-gnu.so' # extension module 'termios' executed from '/usr/lib64/python3.9/lib-dynload/termios.cpython-39-x86_64-linux-gnu.so' import 'termios' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f27c013adf0> import 'getpass' # <_frozen_importlib_external.SourceFileLoader object at 0x7f27c013a580> import 
ansible.module_utils.facts.system.user # loaded from Zip /tmp/ansible_setup_payload_jtlx47sb/ansible_setup_payload.zip/ansible/module_utils/facts/system/user.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware # loaded from Zip /tmp/ansible_setup_payload_jtlx47sb/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware.base # loaded from Zip /tmp/ansible_setup_payload_jtlx47sb/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/base.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware.aix # loaded from Zip /tmp/ansible_setup_payload_jtlx47sb/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/aix.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.sysctl # loaded from Zip /tmp/ansible_setup_payload_jtlx47sb/ansible_setup_payload.zip/ansible/module_utils/facts/sysctl.py import ansible.module_utils.facts.hardware.darwin # loaded from Zip /tmp/ansible_setup_payload_jtlx47sb/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/darwin.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware.freebsd # loaded from Zip /tmp/ansible_setup_payload_jtlx47sb/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/freebsd.py import ansible.module_utils.facts.hardware.dragonfly # loaded from Zip /tmp/ansible_setup_payload_jtlx47sb/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/dragonfly.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware.hpux # loaded from Zip /tmp/ansible_setup_payload_jtlx47sb/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/hpux.py # 
zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware.linux # loaded from Zip /tmp/ansible_setup_payload_jtlx47sb/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/linux.py import ansible.module_utils.facts.hardware.hurd # loaded from Zip /tmp/ansible_setup_payload_jtlx47sb/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/hurd.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware.netbsd # loaded from Zip /tmp/ansible_setup_payload_jtlx47sb/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/netbsd.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware.openbsd # loaded from Zip /tmp/ansible_setup_payload_jtlx47sb/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/openbsd.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware.sunos # loaded from Zip /tmp/ansible_setup_payload_jtlx47sb/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/sunos.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network # loaded from Zip /tmp/ansible_setup_payload_jtlx47sb/ansible_setup_payload.zip/ansible/module_utils/facts/network/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.base # loaded from Zip /tmp/ansible_setup_payload_jtlx47sb/ansible_setup_payload.zip/ansible/module_utils/facts/network/base.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.generic_bsd # loaded from Zip /tmp/ansible_setup_payload_jtlx47sb/ansible_setup_payload.zip/ansible/module_utils/facts/network/generic_bsd.py import ansible.module_utils.facts.network.aix # loaded from Zip 
/tmp/ansible_setup_payload_jtlx47sb/ansible_setup_payload.zip/ansible/module_utils/facts/network/aix.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.darwin # loaded from Zip /tmp/ansible_setup_payload_jtlx47sb/ansible_setup_payload.zip/ansible/module_utils/facts/network/darwin.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.dragonfly # loaded from Zip /tmp/ansible_setup_payload_jtlx47sb/ansible_setup_payload.zip/ansible/module_utils/facts/network/dragonfly.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.fc_wwn # loaded from Zip /tmp/ansible_setup_payload_jtlx47sb/ansible_setup_payload.zip/ansible/module_utils/facts/network/fc_wwn.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.freebsd # loaded from Zip /tmp/ansible_setup_payload_jtlx47sb/ansible_setup_payload.zip/ansible/module_utils/facts/network/freebsd.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.hpux # loaded from Zip /tmp/ansible_setup_payload_jtlx47sb/ansible_setup_payload.zip/ansible/module_utils/facts/network/hpux.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.hurd # loaded from Zip /tmp/ansible_setup_payload_jtlx47sb/ansible_setup_payload.zip/ansible/module_utils/facts/network/hurd.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.linux # loaded from Zip /tmp/ansible_setup_payload_jtlx47sb/ansible_setup_payload.zip/ansible/module_utils/facts/network/linux.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.iscsi # loaded from Zip /tmp/ansible_setup_payload_jtlx47sb/ansible_setup_payload.zip/ansible/module_utils/facts/network/iscsi.py # zipimport: zlib available # zipimport: zlib available 
import ansible.module_utils.facts.network.nvme # loaded from Zip /tmp/ansible_setup_payload_jtlx47sb/ansible_setup_payload.zip/ansible/module_utils/facts/network/nvme.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.netbsd # loaded from Zip /tmp/ansible_setup_payload_jtlx47sb/ansible_setup_payload.zip/ansible/module_utils/facts/network/netbsd.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.openbsd # loaded from Zip /tmp/ansible_setup_payload_jtlx47sb/ansible_setup_payload.zip/ansible/module_utils/facts/network/openbsd.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.sunos # loaded from Zip /tmp/ansible_setup_payload_jtlx47sb/ansible_setup_payload.zip/ansible/module_utils/facts/network/sunos.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.virtual # loaded from Zip /tmp/ansible_setup_payload_jtlx47sb/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.virtual.base # loaded from Zip /tmp/ansible_setup_payload_jtlx47sb/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/base.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.virtual.sysctl # loaded from Zip /tmp/ansible_setup_payload_jtlx47sb/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/sysctl.py import ansible.module_utils.facts.virtual.freebsd # loaded from Zip /tmp/ansible_setup_payload_jtlx47sb/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/freebsd.py import ansible.module_utils.facts.virtual.dragonfly # loaded from Zip 
/tmp/ansible_setup_payload_jtlx47sb/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/dragonfly.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.virtual.hpux # loaded from Zip /tmp/ansible_setup_payload_jtlx47sb/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/hpux.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.virtual.linux # loaded from Zip /tmp/ansible_setup_payload_jtlx47sb/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/linux.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.virtual.netbsd # loaded from Zip /tmp/ansible_setup_payload_jtlx47sb/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/netbsd.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.virtual.openbsd # loaded from Zip /tmp/ansible_setup_payload_jtlx47sb/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/openbsd.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.virtual.sunos # loaded from Zip /tmp/ansible_setup_payload_jtlx47sb/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/sunos.py import ansible.module_utils.facts.default_collectors # loaded from Zip /tmp/ansible_setup_payload_jtlx47sb/ansible_setup_payload.zip/ansible/module_utils/facts/default_collectors.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.ansible_collector # loaded from Zip /tmp/ansible_setup_payload_jtlx47sb/ansible_setup_payload.zip/ansible/module_utils/facts/ansible_collector.py import ansible.module_utils.facts.compat # loaded from Zip /tmp/ansible_setup_payload_jtlx47sb/ansible_setup_payload.zip/ansible/module_utils/facts/compat.py import ansible.module_utils.facts # loaded from Zip /tmp/ansible_setup_payload_jtlx47sb/ansible_setup_payload.zip/ansible/module_utils/facts/__init__.py # zipimport: 
zlib available # /usr/lib64/python3.9/encodings/__pycache__/idna.cpython-39.pyc matches /usr/lib64/python3.9/encodings/idna.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/idna.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/stringprep.cpython-39.pyc matches /usr/lib64/python3.9/stringprep.py # code object from '/usr/lib64/python3.9/__pycache__/stringprep.cpython-39.pyc' # extension module 'unicodedata' loaded from '/usr/lib64/python3.9/lib-dynload/unicodedata.cpython-39-x86_64-linux-gnu.so' # extension module 'unicodedata' executed from '/usr/lib64/python3.9/lib-dynload/unicodedata.cpython-39-x86_64-linux-gnu.so' import 'unicodedata' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f27bfeeabb0> import 'stringprep' # <_frozen_importlib_external.SourceFileLoader object at 0x7f27bfead2e0> import 'encodings.idna' # <_frozen_importlib_external.SourceFileLoader object at 0x7f27bfeadcd0> import 'gc' # {"ansible_facts": {"ansible_date_time": {"year": "2024", "month": "09", "weekday": "Tuesday", "weekday_number": "2", "weeknumber": "39", "day": "24", "hour": "15", "minute": "00", "second": "45", "epoch": "1727204445", "epoch_int": "1727204445", "date": "2024-09-24", "time": "15:00:45", "iso8601_micro": "2024-09-24T19:00:45.927356Z", "iso8601": "2024-09-24T19:00:45Z", "iso8601_basic": "20240924T150045927356", "iso8601_basic_short": "20240924T150045", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-511.el9.x86_64", "root": "UUID=ad406aa3-aab4-4a6a-aa73-3e870a6316ae", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-511.el9.x86_64", "root": "UUID=ad406aa3-aab4-4a6a-aa73-3e870a6316ae", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, 
"ansible_ssh_host_key_dsa_public": "AAAAB3NzaC1kc3MAAACBAL33r0sK53nK1ELEWEygFfBly+jKL3G1irB+e4OjfP+034giVSb4+qmZbsccYzFRUysDiwQ9AOkXxjXzuDmR+xqyvjg1wiGR1mtnhVEjD5QOMP3FrsA4T0YUj+99RePF5V1syZcivhL83fhGMQW2xqX2DsatToaaogZ2OB3PfjtjAAAAFQDxVD/D0jmbOX2y1lkpNUepQHopQwAAAIEAmIlCnXBlcPjUWk7TsM1FtBhlH1jHHCOTF1EkipNNB3yizhCo4XzHdZ42Etc3A12/rcZ94rFaauqCV6XrShBkQ2YBIcz9u8BOyWI/nScoq9IA/qLrhWWtjBWDyrdKnEa5YZssQtDa+FaZQkzy1TQpvFQxv5c95+TrmPFgDpw+0q0AAACBAKYOTFtEPTGzq9w6YdMspRWk65ZgXou58bQl818PvNnuZKKVReDFknfNCcabfj+HjlOg9wBCZZ+D3vopxZ4Qgevz/pLqcnLY7Kxx+xf6NhqDwcEwkHk/VYomBLrfyEZP8N81dcv36ZZUVoca5Y+2ZG2o1gC632nLGosyJBtmPmel", "ansible_ssh_host_key_dsa_public_keytype": "ssh-dss", "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCzhflzK5hY1zLI6wcdpu95QnCG0TwdK/8SyDudYYHDoRycyI9pVuSUQsXpQq3jHdjCurFgtKtyr8lvo1dWvKJ9SZpQk4asejncDNODyhSR95eNQg6E1G2kN1mscOp76cjW9Muvyhcku112WdRWTEojLJL5DfJAWrsWwHJI+QuhZuKvrlsxPvfOuY5td/aGC/Ydzbjkmya2qvXXJRscQArDnYmiPiatkFESRif9MXdmIn2LqQXAcZGFUG+SWQvZR1PDWKI2U5HxvoeUf+Uh2jDO3mFDWao9+SGRC2QuO+xLJgoiKIx2L3GWLTkbKjAbsk0iedaUuh+GdmUatsU09UVZi9IYBJYjhiYuZKsYx2LNpBqu8fxh5RaBfadQzkDGVBJE45/9X+9vlSygk3zMak9yWtS9vfV+CoODJx9wA1tv3r0Veiy/Y9bbcT7DtQhiWscP2X/cF2QZtdabW+Rb+zKZomn+6upN+zZeyVRClRsqVNURxevMs+UyJTKV481ayMU=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBHpEZiCiKJZKK5PvXzPGl0kyJcU4P7nxoUjBffLcHt9dAB0RhjGORZ4v3/W6TdO0PAsLaKZ7WyFecLN3V9VWyiA=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIK5vZWfq5/76ny3vCPOJqG/mpsIiiNwZzQWhA7bM1PFT", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_system": "Linux", "ansible_kernel": "5.14.0-511.el9.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Thu Sep 19 06:52:39 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.9.19", "ansible_fqdn": "managed-node1", 
"ansible_hostname": "managed-node1", "ansible_nodename": "managed-node1", "ansible_domain": "", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "622812727ab94fd6acd7dd0d437b6e90", "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "9", "ansible_distribution_major_version": "9", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_lsb": {}, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_fips": false, "ansible_local": {}, "ansible_python": {"version": {"major": 3, "minor": 9, "micro": 19, "releaselevel": "final", "serial": 0}, "version_info": [3, 9, 19, "final", 0], "executable": "/usr/bin/python3.9", "has_sslcontext": true, "type": "cpython"}, "ansible_apparmor": {"status": "disabled"}, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_service_mgr": "systemd", "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "root", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_env": {"PYTHONVERBOSE": "1", "SHELL": "/bin/bash", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.9", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.14.85 42862 10.31.9.148 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh 
%s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.14.85 42862 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "which_declare": "declare -f", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0", "BASH_FUNC_which%%": "() { ( alias;\n eval ${which_declare} ) | /usr/bin/which --tty-only --read-alias --read-functions --show-tilde --show-dot $@\n}"}, "ansible_pkg_mgr": "dnf", "gather_subset": ["min"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["min"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.path_hooks # clear sys.path_importer_cache # clear sys.meta_path # clear sys.__interactivehook__ # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing encodings.latin_1 # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # 
cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing _locale # cleanup[2] removing _bootlocale # destroy _bootlocale # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing sre_constants # destroy sre_constants # cleanup[2] removing sre_parse # cleanup[2] removing sre_compile # cleanup[2] removing _heapq # cleanup[2] removing heapq # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing collections.abc # cleanup[2] removing contextlib # cleanup[2] removing typing # destroy typing # cleanup[2] removing importlib.abc # cleanup[2] removing importlib.util # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing threading # cleanup[2] removing _bz2 # destroy _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing pwd # cleanup[2] removing grp # cleanup[2] removing shutil # cleanup[2] removing math # 
cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing tempfile # cleanup[2] removing zipfile # destroy zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing fcntl # cleanup[2] removing locale # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing subprocess # cleanup[2] removing platform # cleanup[2] removing shlex # cleanup[2] removing token # destroy token # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy 
ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # 
destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # 
cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] 
removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing 
ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos 
# destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna # cleanup[2] removing gc # destroy _sitebuiltins # destroy importlib.util # destroy importlib.abc # destroy importlib.machinery # destroy zipimport # destroy _compression # destroy binascii # destroy importlib # destroy bz2 # destroy lzma # destroy __main__ # destroy locale # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy encodings # destroy syslog # 
destroy uuid # destroy selinux # destroy distro # destroy logging # destroy argparse # destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.connection # destroy multiprocessing.pool # destroy pickle # destroy multiprocessing.context # destroy array # destroy _compat_pickle # destroy queue # destroy multiprocessing.process # destroy tempfile # destroy multiprocessing.util # destroy multiprocessing.reduction # destroy shlex # destroy datetime # destroy base64 # destroy ansible.module_utils.compat.selinux # destroy getpass # destroy json # destroy socket # destroy struct # destroy glob # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # cleanup[3] wiping gc # cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping unicodedata # cleanup[3] wiping termios # cleanup[3] wiping _ssl # cleanup[3] wiping configparser # cleanup[3] wiping _multiprocessing # cleanup[3] wiping _queue # cleanup[3] wiping _pickle # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # cleanup[3] wiping tokenize # cleanup[3] wiping platform # destroy subprocess # cleanup[3] wiping selectors # cleanup[3] wiping select # cleanup[3] wiping _posixsubprocess # cleanup[3] wiping signal # cleanup[3] wiping fcntl # cleanup[3] wiping atexit # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping _blake2 # cleanup[3] 
wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping shutil # destroy fnmatch # cleanup[3] wiping grp # cleanup[3] wiping pwd # cleanup[3] wiping _lzma # cleanup[3] wiping threading # cleanup[3] wiping zlib # cleanup[3] wiping errno # cleanup[3] wiping weakref # cleanup[3] wiping contextlib # cleanup[3] wiping collections.abc # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy enum # destroy sre_compile # destroy copyreg # cleanup[3] wiping functools # cleanup[3] wiping _functools # destroy _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy heapq # destroy collections.abc # cleanup[3] wiping _collections # destroy _collections # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping itertools # cleanup[3] wiping _heapq # cleanup[3] wiping sre_parse # cleanup[3] wiping _sre # cleanup[3] wiping types # cleanup[3] wiping _locale # destroy _locale # cleanup[3] wiping os # cleanup[3] wiping os.path # destroy genericpath # cleanup[3] wiping posixpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.latin_1 # cleanup[3] wiping _signal # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy gc # destroy unicodedata # destroy termios # destroy _ssl # destroy _multiprocessing # destroy _queue # 
destroy _pickle # destroy systemd._daemon # destroy _socket # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy fcntl # destroy _blake2 # destroy _lzma # destroy zlib # destroy _signal # destroy platform # destroy _uuid # destroy _sre # destroy sre_parse # destroy tokenize # destroy _heapq # destroy posixpath # destroy stat # destroy ansible.module_utils.six.moves.urllib # destroy errno # destroy signal # destroy contextlib # destroy pwd # destroy grp # destroy _posixsubprocess # destroy selectors # destroy select # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy itertools # destroy operator # destroy ansible.module_utils.six.moves # destroy _operator # destroy _frozen_importlib_external # destroy _imp # destroy io # destroy marshal # destroy _frozen_importlib # clear sys.audit hooks , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.148 closed. [WARNING]: Module invocation had junk after the JSON data: # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.path_hooks # clear sys.path_importer_cache # clear sys.meta_path # clear sys.__interactivehook__ # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing encodings.latin_1 # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing _locale # cleanup[2] removing _bootlocale # destroy _bootlocale # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing sre_constants # destroy sre_constants # cleanup[2] removing 
sre_parse # cleanup[2] removing sre_compile # cleanup[2] removing _heapq # cleanup[2] removing heapq # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing collections.abc # cleanup[2] removing contextlib # cleanup[2] removing typing # destroy typing # cleanup[2] removing importlib.abc # cleanup[2] removing importlib.util # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing threading # cleanup[2] removing _bz2 # destroy _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing pwd # cleanup[2] removing grp # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing tempfile # cleanup[2] removing zipfile # destroy zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing 
ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing fcntl # cleanup[2] removing locale # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing subprocess # cleanup[2] removing platform # cleanup[2] removing shlex # cleanup[2] removing token # destroy token # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # 
cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing 
ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] 
removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # 
cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy 
ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy 
ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna # cleanup[2] removing gc # destroy _sitebuiltins # destroy importlib.util # destroy importlib.abc # destroy importlib.machinery # destroy zipimport # destroy _compression # destroy binascii # destroy importlib # destroy bz2 # destroy lzma # destroy __main__ # destroy locale # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy encodings # destroy syslog # destroy uuid # destroy selinux # destroy distro # destroy logging # destroy argparse # destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.connection # destroy multiprocessing.pool # destroy pickle # destroy multiprocessing.context # destroy array # destroy _compat_pickle # destroy queue # destroy 
multiprocessing.process # destroy tempfile # destroy multiprocessing.util # destroy multiprocessing.reduction # destroy shlex # destroy datetime # destroy base64 # destroy ansible.module_utils.compat.selinux # destroy getpass # destroy json # destroy socket # destroy struct # destroy glob # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # cleanup[3] wiping gc # cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping unicodedata # cleanup[3] wiping termios # cleanup[3] wiping _ssl # cleanup[3] wiping configparser # cleanup[3] wiping _multiprocessing # cleanup[3] wiping _queue # cleanup[3] wiping _pickle # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # cleanup[3] wiping tokenize # cleanup[3] wiping platform # destroy subprocess # cleanup[3] wiping selectors # cleanup[3] wiping select # cleanup[3] wiping _posixsubprocess # cleanup[3] wiping signal # cleanup[3] wiping fcntl # cleanup[3] wiping atexit # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping _blake2 # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping shutil # destroy fnmatch # cleanup[3] wiping grp # cleanup[3] wiping pwd # cleanup[3] wiping _lzma # cleanup[3] wiping threading # cleanup[3] wiping zlib # cleanup[3] wiping errno # cleanup[3] wiping weakref # cleanup[3] wiping contextlib # cleanup[3] wiping collections.abc # cleanup[3] 
wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy enum # destroy sre_compile # destroy copyreg # cleanup[3] wiping functools # cleanup[3] wiping _functools # destroy _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy heapq # destroy collections.abc # cleanup[3] wiping _collections # destroy _collections # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping itertools # cleanup[3] wiping _heapq # cleanup[3] wiping sre_parse # cleanup[3] wiping _sre # cleanup[3] wiping types # cleanup[3] wiping _locale # destroy _locale # cleanup[3] wiping os # cleanup[3] wiping os.path # destroy genericpath # cleanup[3] wiping posixpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.latin_1 # cleanup[3] wiping _signal # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy gc # destroy unicodedata # destroy termios # destroy _ssl # destroy _multiprocessing # destroy _queue # destroy _pickle # destroy systemd._daemon # destroy _socket # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy fcntl # destroy _blake2 # destroy _lzma # destroy zlib # destroy _signal # destroy platform # destroy _uuid # destroy _sre # destroy sre_parse # destroy tokenize # destroy _heapq # destroy posixpath # destroy stat # destroy 
ansible.module_utils.six.moves.urllib # destroy errno # destroy signal # destroy contextlib # destroy pwd # destroy grp # destroy _posixsubprocess # destroy selectors # destroy select # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy itertools # destroy operator # destroy ansible.module_utils.six.moves # destroy _operator # destroy _frozen_importlib_external # destroy _imp # destroy io # destroy marshal # destroy _frozen_importlib # clear sys.audit hooks 41684 1727204445.97919: done with _execute_module (setup, {'gather_subset': 'min', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204445.451992-42145-243691883381375/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 41684 1727204445.97928: _low_level_execute_command(): starting 41684 1727204445.97931: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204445.451992-42145-243691883381375/ > /dev/null 2>&1 && sleep 0' 41684 1727204445.98835: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 41684 1727204445.98843: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41684 1727204445.98885: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204445.98892: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 41684 1727204445.98956: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found <<< 41684 1727204445.98966: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204445.99097: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 41684 1727204445.99198: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41684 1727204445.99301: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41684 1727204446.01133: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41684 1727204446.01137: stdout chunk (state=3): >>><<< 41684 1727204446.01139: stderr chunk (state=3): >>><<< 41684 1727204446.01580: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41684 1727204446.01587: handler run complete 41684 1727204446.01590: variable 'ansible_facts' from source: unknown 41684 1727204446.01592: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41684 1727204446.01594: variable 'ansible_facts' from source: unknown 41684 1727204446.01596: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41684 1727204446.01598: attempt loop complete, returning result 41684 1727204446.01600: _execute() done 41684 1727204446.01602: dumping result to json 41684 1727204446.01604: done dumping result, returning 41684 1727204446.01606: done running TaskExecutor() for managed-node1/TASK: Gather the minimum subset of ansible_facts required by the network role test [0affcd87-79f5-3839-086d-0000000000d0] 41684 1727204446.01608: sending task result for task 0affcd87-79f5-3839-086d-0000000000d0 ok: [managed-node1] 41684 1727204446.01868: no more pending results, returning what we have 41684 1727204446.01872: results queue empty 41684 1727204446.01873: checking for any_errors_fatal 41684 1727204446.01874: done checking for any_errors_fatal 41684 1727204446.01875: checking for 
max_fail_percentage 41684 1727204446.01877: done checking for max_fail_percentage 41684 1727204446.01878: checking to see if all hosts have failed and the running result is not ok 41684 1727204446.01878: done checking to see if all hosts have failed 41684 1727204446.01879: getting the remaining hosts for this loop 41684 1727204446.01881: done getting the remaining hosts for this loop 41684 1727204446.01885: getting the next task for host managed-node1 41684 1727204446.01895: done getting next task for host managed-node1 41684 1727204446.01898: ^ task is: TASK: Check if system is ostree 41684 1727204446.01900: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 41684 1727204446.01906: getting variables 41684 1727204446.01908: in VariableManager get_vars() 41684 1727204446.01941: Calling all_inventory to load vars for managed-node1 41684 1727204446.01944: Calling groups_inventory to load vars for managed-node1 41684 1727204446.01947: Calling all_plugins_inventory to load vars for managed-node1 41684 1727204446.01958: Calling all_plugins_play to load vars for managed-node1 41684 1727204446.01961: Calling groups_plugins_inventory to load vars for managed-node1 41684 1727204446.01965: Calling groups_plugins_play to load vars for managed-node1 41684 1727204446.02173: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41684 1727204446.02647: done with get_vars() 41684 1727204446.02663: done getting variables 41684 1727204446.02742: done sending task result for task 0affcd87-79f5-3839-086d-0000000000d0 41684 1727204446.02746: WORKER PROCESS EXITING TASK [Check if system is ostree] *********************************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:17 Tuesday 24 September 2024 15:00:46 -0400 (0:00:00.741) 0:00:02.432 ***** 41684 1727204446.03118: entering _queue_task() for managed-node1/stat 41684 1727204446.04449: worker is 1 (out of 1 available) 41684 1727204446.04466: exiting _queue_task() for managed-node1/stat 41684 1727204446.04478: done queuing things up, now waiting for results queue to drain 41684 1727204446.04480: waiting for pending results... 
41684 1727204446.04727: running TaskExecutor() for managed-node1/TASK: Check if system is ostree 41684 1727204446.04825: in run() - task 0affcd87-79f5-3839-086d-0000000000d2 41684 1727204446.04838: variable 'ansible_search_path' from source: unknown 41684 1727204446.04843: variable 'ansible_search_path' from source: unknown 41684 1727204446.04890: calling self._execute() 41684 1727204446.04966: variable 'ansible_host' from source: host vars for 'managed-node1' 41684 1727204446.04971: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41684 1727204446.04981: variable 'omit' from source: magic vars 41684 1727204446.05547: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 41684 1727204446.05808: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 41684 1727204446.05853: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 41684 1727204446.05892: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 41684 1727204446.06472: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 41684 1727204446.06554: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 41684 1727204446.06586: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 41684 1727204446.06626: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 41684 1727204446.06719: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 41684 1727204446.06898: Evaluated conditional (not __network_is_ostree is defined): True 41684 1727204446.07024: variable 'omit' from source: magic vars 41684 1727204446.07128: variable 'omit' from source: magic vars 41684 1727204446.07291: variable 'omit' from source: magic vars 41684 1727204446.07320: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 41684 1727204446.07397: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 41684 1727204446.07474: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 41684 1727204446.07494: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41684 1727204446.07574: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41684 1727204446.07608: variable 'inventory_hostname' from source: host vars for 'managed-node1' 41684 1727204446.07616: variable 'ansible_host' from source: host vars for 'managed-node1' 41684 1727204446.07624: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41684 1727204446.07838: Set connection var ansible_connection to ssh 41684 1727204446.07896: Set connection var ansible_pipelining to False 41684 1727204446.07929: Set connection var ansible_module_compression to ZIP_DEFLATED 41684 1727204446.07985: Set connection var ansible_timeout to 10 41684 1727204446.08008: Set connection var ansible_shell_executable to /bin/sh 41684 1727204446.08071: Set connection var ansible_shell_type to sh 41684 1727204446.08099: variable 'ansible_shell_executable' from source: unknown 41684 1727204446.08205: variable 'ansible_connection' from 
source: unknown 41684 1727204446.08248: variable 'ansible_module_compression' from source: unknown 41684 1727204446.08296: variable 'ansible_shell_type' from source: unknown 41684 1727204446.08327: variable 'ansible_shell_executable' from source: unknown 41684 1727204446.08335: variable 'ansible_host' from source: host vars for 'managed-node1' 41684 1727204446.08342: variable 'ansible_pipelining' from source: unknown 41684 1727204446.08349: variable 'ansible_timeout' from source: unknown 41684 1727204446.08356: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41684 1727204446.08696: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 41684 1727204446.08709: variable 'omit' from source: magic vars 41684 1727204446.08716: starting attempt loop 41684 1727204446.08721: running the handler 41684 1727204446.08737: _low_level_execute_command(): starting 41684 1727204446.08747: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 41684 1727204446.09529: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 41684 1727204446.09544: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41684 1727204446.09557: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41684 1727204446.09577: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41684 1727204446.09615: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 41684 1727204446.09627: stderr chunk (state=3): >>>debug2: match not found <<< 41684 1727204446.09645: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 
1727204446.09669: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 41684 1727204446.09684: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 41684 1727204446.09699: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 41684 1727204446.09713: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41684 1727204446.09727: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41684 1727204446.09743: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41684 1727204446.09758: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 41684 1727204446.09774: stderr chunk (state=3): >>>debug2: match found <<< 41684 1727204446.09788: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204446.09864: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 41684 1727204446.09892: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41684 1727204446.09910: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41684 1727204446.10004: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41684 1727204446.11558: stdout chunk (state=3): >>>/root <<< 41684 1727204446.11768: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41684 1727204446.11773: stdout chunk (state=3): >>><<< 41684 1727204446.11777: stderr chunk (state=3): >>><<< 41684 1727204446.11896: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 
'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41684 1727204446.11909: _low_level_execute_command(): starting 41684 1727204446.11912: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204446.118009-42174-242621941361361 `" && echo ansible-tmp-1727204446.118009-42174-242621941361361="` echo /root/.ansible/tmp/ansible-tmp-1727204446.118009-42174-242621941361361 `" ) && sleep 0' 41684 1727204446.13081: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 41684 1727204446.13086: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41684 1727204446.13128: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found <<< 41684 1727204446.13132: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 41684 1727204446.13134: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204446.13203: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 41684 1727204446.13207: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41684 1727204446.13209: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41684 1727204446.13285: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 41684 1727204446.15581: stdout chunk (state=3): >>>ansible-tmp-1727204446.118009-42174-242621941361361=/root/.ansible/tmp/ansible-tmp-1727204446.118009-42174-242621941361361 <<< 41684 1727204446.15792: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41684 1727204446.15796: stdout chunk (state=3): >>><<< 41684 1727204446.15799: stderr chunk (state=3): >>><<< 41684 1727204446.15870: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204446.118009-42174-242621941361361=/root/.ansible/tmp/ansible-tmp-1727204446.118009-42174-242621941361361 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 
10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 41684 1727204446.15980: variable 'ansible_module_compression' from source: unknown 41684 1727204446.15983: ANSIBALLZ: Using lock for stat 41684 1727204446.15986: ANSIBALLZ: Acquiring lock 41684 1727204446.15988: ANSIBALLZ: Lock acquired: 139842516809008 41684 1727204446.15990: ANSIBALLZ: Creating module 41684 1727204446.31636: ANSIBALLZ: Writing module into payload 41684 1727204446.31782: ANSIBALLZ: Writing module 41684 1727204446.31811: ANSIBALLZ: Renaming module 41684 1727204446.31821: ANSIBALLZ: Done creating module 41684 1727204446.31842: variable 'ansible_facts' from source: unknown 41684 1727204446.31911: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204446.118009-42174-242621941361361/AnsiballZ_stat.py 41684 1727204446.32081: Sending initial data 41684 1727204446.32084: Sent initial data (152 bytes) 41684 1727204446.33136: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 41684 1727204446.33152: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41684 1727204446.33168: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41684 1727204446.33199: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41684 1727204446.33244: stderr chunk (state=3): >>>debug2: checking match for 
'final all' host 10.31.9.148 originally 10.31.9.148 <<< 41684 1727204446.33256: stderr chunk (state=3): >>>debug2: match not found <<< 41684 1727204446.33272: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204446.33295: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 41684 1727204446.33313: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 41684 1727204446.33325: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 41684 1727204446.33336: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41684 1727204446.33349: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41684 1727204446.33365: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41684 1727204446.33378: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 41684 1727204446.33389: stderr chunk (state=3): >>>debug2: match found <<< 41684 1727204446.33409: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204446.33491: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 41684 1727204446.33508: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41684 1727204446.33532: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41684 1727204446.33619: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41684 1727204446.35346: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension 
"hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 41684 1727204446.35397: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 41684 1727204446.35458: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-41684fyviudxd/tmp37l3vxuk /root/.ansible/tmp/ansible-tmp-1727204446.118009-42174-242621941361361/AnsiballZ_stat.py <<< 41684 1727204446.35503: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 41684 1727204446.36804: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41684 1727204446.36937: stderr chunk (state=3): >>><<< 41684 1727204446.36940: stdout chunk (state=3): >>><<< 41684 1727204446.36947: done transferring module to remote 41684 1727204446.36949: _low_level_execute_command(): starting 41684 1727204446.36952: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204446.118009-42174-242621941361361/ /root/.ansible/tmp/ansible-tmp-1727204446.118009-42174-242621941361361/AnsiballZ_stat.py && sleep 0' 41684 1727204446.37823: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 41684 1727204446.37944: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41684 1727204446.38054: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41684 1727204446.38079: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41684 1727204446.38124: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 41684 
1727204446.38136: stderr chunk (state=3): >>>debug2: match not found <<< 41684 1727204446.38155: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204446.38179: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 41684 1727204446.38191: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 41684 1727204446.38203: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 41684 1727204446.38215: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41684 1727204446.38229: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41684 1727204446.38244: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41684 1727204446.38260: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 41684 1727204446.38278: stderr chunk (state=3): >>>debug2: match found <<< 41684 1727204446.38293: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204446.38377: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 41684 1727204446.38499: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41684 1727204446.38515: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41684 1727204446.38712: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41684 1727204446.40480: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41684 1727204446.40484: stdout chunk (state=3): >>><<< 41684 1727204446.40487: stderr chunk (state=3): >>><<< 41684 1727204446.40582: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: 
Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41684 1727204446.40586: _low_level_execute_command(): starting 41684 1727204446.40588: _low_level_execute_command(): executing: /bin/sh -c 'PYTHONVERBOSE=1 /usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204446.118009-42174-242621941361361/AnsiballZ_stat.py && sleep 0' 41684 1727204446.41155: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 41684 1727204446.41176: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41684 1727204446.41190: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41684 1727204446.41207: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41684 1727204446.41251: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 41684 1727204446.41269: stderr chunk (state=3): >>>debug2: match not found <<< 41684 1727204446.41285: stderr chunk 
(state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204446.41302: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 41684 1727204446.41312: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 41684 1727204446.41322: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 41684 1727204446.41332: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41684 1727204446.41344: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41684 1727204446.41360: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41684 1727204446.41377: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 41684 1727204446.41388: stderr chunk (state=3): >>>debug2: match found <<< 41684 1727204446.41400: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204446.41480: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 41684 1727204446.41496: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41684 1727204446.41511: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41684 1727204446.41610: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41684 1727204446.44049: stdout chunk (state=3): >>>import _frozen_importlib # frozen import _imp # builtin <<< 41684 1727204446.44085: stdout chunk (state=3): >>>import '_thread' # import '_warnings' # <<< 41684 1727204446.44089: stdout chunk (state=3): >>> import '_weakref' # <<< 41684 1727204446.44179: stdout chunk (state=3): >>>import '_io' # <<< 41684 1727204446.44189: stdout chunk (state=3): >>>import 'marshal' # <<< 41684 1727204446.44244: stdout chunk (state=3): >>>import 'posix' # <<< 
41684 1727204446.44285: stdout chunk (state=3): >>>import '_frozen_importlib_external' # <<< 41684 1727204446.44289: stdout chunk (state=3): >>># installing zipimport hook <<< 41684 1727204446.44348: stdout chunk (state=3): >>>import 'time' # <<< 41684 1727204446.44363: stdout chunk (state=3): >>>import 'zipimport' # <<< 41684 1727204446.44371: stdout chunk (state=3): >>># installed zipimport hook <<< 41684 1727204446.44444: stdout chunk (state=3): >>># /usr/lib64/python3.9/encodings/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/encodings/__init__.py <<< 41684 1727204446.44453: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/encodings/__pycache__/__init__.cpython-39.pyc' <<< 41684 1727204446.44485: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/codecs.cpython-39.pyc matches /usr/lib64/python3.9/codecs.py <<< 41684 1727204446.44519: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/codecs.cpython-39.pyc' <<< 41684 1727204446.44531: stdout chunk (state=3): >>>import '_codecs' # <<< 41684 1727204446.44573: stdout chunk (state=3): >>>import 'codecs' # <_frozen_importlib_external.SourceFileLoader object at 0x7f502c698dc0> <<< 41684 1727204446.44620: stdout chunk (state=3): >>># /usr/lib64/python3.9/encodings/__pycache__/aliases.cpython-39.pyc matches /usr/lib64/python3.9/encodings/aliases.py <<< 41684 1727204446.44650: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/encodings/__pycache__/aliases.cpython-39.pyc' <<< 41684 1727204446.44655: stdout chunk (state=3): >>>import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7f502c63d3a0> <<< 41684 1727204446.44672: stdout chunk (state=3): >>>import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f502c698b20> <<< 41684 1727204446.44702: stdout chunk (state=3): >>># /usr/lib64/python3.9/encodings/__pycache__/utf_8.cpython-39.pyc matches 
/usr/lib64/python3.9/encodings/utf_8.py <<< 41684 1727204446.44718: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/encodings/__pycache__/utf_8.cpython-39.pyc' <<< 41684 1727204446.44745: stdout chunk (state=3): >>>import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7f502c698ac0> <<< 41684 1727204446.44779: stdout chunk (state=3): >>>import '_signal' # <<< 41684 1727204446.44797: stdout chunk (state=3): >>># /usr/lib64/python3.9/encodings/__pycache__/latin_1.cpython-39.pyc matches /usr/lib64/python3.9/encodings/latin_1.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/latin_1.cpython-39.pyc' <<< 41684 1727204446.44817: stdout chunk (state=3): >>>import 'encodings.latin_1' # <_frozen_importlib_external.SourceFileLoader object at 0x7f502c63d490> <<< 41684 1727204446.44846: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/io.cpython-39.pyc matches /usr/lib64/python3.9/io.py <<< 41684 1727204446.44858: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/io.cpython-39.pyc' <<< 41684 1727204446.44879: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/abc.py <<< 41684 1727204446.44882: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/abc.cpython-39.pyc' <<< 41684 1727204446.44894: stdout chunk (state=3): >>>import '_abc' # <<< 41684 1727204446.44899: stdout chunk (state=3): >>>import 'abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f502c63d940> <<< 41684 1727204446.44921: stdout chunk (state=3): >>>import 'io' # <_frozen_importlib_external.SourceFileLoader object at 0x7f502c63d670> <<< 41684 1727204446.44974: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/site.cpython-39.pyc matches /usr/lib64/python3.9/site.py <<< 41684 1727204446.44984: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/site.cpython-39.pyc' <<< 
41684 1727204446.44991: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/os.cpython-39.pyc matches /usr/lib64/python3.9/os.py <<< 41684 1727204446.45026: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/os.cpython-39.pyc' <<< 41684 1727204446.45049: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/stat.cpython-39.pyc matches /usr/lib64/python3.9/stat.py <<< 41684 1727204446.45079: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/stat.cpython-39.pyc' <<< 41684 1727204446.45101: stdout chunk (state=3): >>>import '_stat' # <<< 41684 1727204446.45105: stdout chunk (state=3): >>>import 'stat' # <_frozen_importlib_external.SourceFileLoader object at 0x7f502c3cf190> <<< 41684 1727204446.45128: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/_collections_abc.cpython-39.pyc matches /usr/lib64/python3.9/_collections_abc.py <<< 41684 1727204446.45149: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/_collections_abc.cpython-39.pyc' <<< 41684 1727204446.45268: stdout chunk (state=3): >>>import '_collections_abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f502c3cf220> <<< 41684 1727204446.45298: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/posixpath.cpython-39.pyc matches /usr/lib64/python3.9/posixpath.py # code object from '/usr/lib64/python3.9/__pycache__/posixpath.cpython-39.pyc' <<< 41684 1727204446.45332: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/genericpath.cpython-39.pyc matches /usr/lib64/python3.9/genericpath.py <<< 41684 1727204446.45335: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/genericpath.cpython-39.pyc' import 'genericpath' # <_frozen_importlib_external.SourceFileLoader object at 0x7f502c3f2850> import 'posixpath' # <_frozen_importlib_external.SourceFileLoader object at 0x7f502c3cf940> <<< 41684 1727204446.45387: stdout chunk (state=3): 
>>>import 'os' # <_frozen_importlib_external.SourceFileLoader object at 0x7f502c655880> <<< 41684 1727204446.45430: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/_sitebuiltins.cpython-39.pyc matches /usr/lib64/python3.9/_sitebuiltins.py # code object from '/usr/lib64/python3.9/__pycache__/_sitebuiltins.cpython-39.pyc' import '_sitebuiltins' # <_frozen_importlib_external.SourceFileLoader object at 0x7f502c3c8d90> <<< 41684 1727204446.45489: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/_bootlocale.cpython-39.pyc matches /usr/lib64/python3.9/_bootlocale.py # code object from '/usr/lib64/python3.9/__pycache__/_bootlocale.cpython-39.pyc' import '_locale' # import '_bootlocale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f502c3f2d90> <<< 41684 1727204446.45543: stdout chunk (state=3): >>>import 'site' # <_frozen_importlib_external.SourceFileLoader object at 0x7f502c63d970> <<< 41684 1727204446.45561: stdout chunk (state=3): >>>Python 3.9.19 (main, Aug 23 2024, 00:00:00) [GCC 11.5.0 20240719 (Red Hat 11.5.0-2)] on linux Type "help", "copyright", "credits" or "license" for more information. 
<<< 41684 1727204446.45765: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/base64.cpython-39.pyc matches /usr/lib64/python3.9/base64.py <<< 41684 1727204446.45799: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/base64.cpython-39.pyc' <<< 41684 1727204446.45823: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/re.cpython-39.pyc matches /usr/lib64/python3.9/re.py # code object from '/usr/lib64/python3.9/__pycache__/re.cpython-39.pyc' <<< 41684 1727204446.45859: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/enum.cpython-39.pyc matches /usr/lib64/python3.9/enum.py # code object from '/usr/lib64/python3.9/__pycache__/enum.cpython-39.pyc' <<< 41684 1727204446.45883: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/types.cpython-39.pyc matches /usr/lib64/python3.9/types.py # code object from '/usr/lib64/python3.9/__pycache__/types.cpython-39.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7f502c393f10> <<< 41684 1727204446.45927: stdout chunk (state=3): >>>import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7f502c3990a0> <<< 41684 1727204446.45952: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/sre_compile.cpython-39.pyc matches /usr/lib64/python3.9/sre_compile.py # code object from '/usr/lib64/python3.9/__pycache__/sre_compile.cpython-39.pyc' <<< 41684 1727204446.46007: stdout chunk (state=3): >>>import '_sre' # # /usr/lib64/python3.9/__pycache__/sre_parse.cpython-39.pyc matches /usr/lib64/python3.9/sre_parse.py <<< 41684 1727204446.46037: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/sre_parse.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/sre_constants.cpython-39.pyc matches /usr/lib64/python3.9/sre_constants.py # code object from '/usr/lib64/python3.9/__pycache__/sre_constants.cpython-39.pyc' <<< 41684 1727204446.46057: stdout chunk (state=3): >>>import 'sre_constants' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f502c38c5b0> <<< 41684 1727204446.46087: stdout chunk (state=3): >>>import 'sre_parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f502c3946a0> import 'sre_compile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f502c3933d0> <<< 41684 1727204446.46108: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/functools.cpython-39.pyc matches /usr/lib64/python3.9/functools.py <<< 41684 1727204446.46179: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/functools.cpython-39.pyc' <<< 41684 1727204446.46203: stdout chunk (state=3): >>># /usr/lib64/python3.9/collections/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/collections/__init__.py <<< 41684 1727204446.46225: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/collections/__pycache__/__init__.cpython-39.pyc' <<< 41684 1727204446.46259: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/heapq.cpython-39.pyc matches /usr/lib64/python3.9/heapq.py # code object from '/usr/lib64/python3.9/__pycache__/heapq.cpython-39.pyc' <<< 41684 1727204446.46368: stdout chunk (state=3): >>># extension module '_heapq' loaded from '/usr/lib64/python3.9/lib-dynload/_heapq.cpython-39-x86_64-linux-gnu.so' # extension module '_heapq' executed from '/usr/lib64/python3.9/lib-dynload/_heapq.cpython-39-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f502c316e80> import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7f502c316970> import 'itertools' # <<< 41684 1727204446.46692: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/keyword.cpython-39.pyc matches /usr/lib64/python3.9/keyword.py # code object from '/usr/lib64/python3.9/__pycache__/keyword.cpython-39.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7f502c316f70> # 
/usr/lib64/python3.9/__pycache__/operator.cpython-39.pyc matches /usr/lib64/python3.9/operator.py <<< 41684 1727204446.46695: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/operator.cpython-39.pyc' import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7f502c316dc0> <<< 41684 1727204446.46698: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/reprlib.cpython-39.pyc matches /usr/lib64/python3.9/reprlib.py # code object from '/usr/lib64/python3.9/__pycache__/reprlib.cpython-39.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f502c326130> import '_collections' # import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7f502c36edf0> import '_functools' # import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7f502c3676d0> # /usr/lib64/python3.9/__pycache__/copyreg.cpython-39.pyc matches /usr/lib64/python3.9/copyreg.py # code object from '/usr/lib64/python3.9/__pycache__/copyreg.cpython-39.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7f502c37a730> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7f502c39ae80> <<< 41684 1727204446.46701: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/struct.cpython-39.pyc matches /usr/lib64/python3.9/struct.py # code object from '/usr/lib64/python3.9/__pycache__/struct.cpython-39.pyc' <<< 41684 1727204446.46703: stdout chunk (state=3): >>># extension module '_struct' loaded from '/usr/lib64/python3.9/lib-dynload/_struct.cpython-39-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.9/lib-dynload/_struct.cpython-39-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f502c326d30> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7f502c36e310> <<< 41684 1727204446.46706: stdout chunk (state=3): >>># 
extension module 'binascii' loaded from '/usr/lib64/python3.9/lib-dynload/binascii.cpython-39-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.9/lib-dynload/binascii.cpython-39-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f502c37a340> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7f502c3a0a30> <<< 41684 1727204446.46759: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/runpy.cpython-39.pyc matches /usr/lib64/python3.9/runpy.py # code object from '/usr/lib64/python3.9/__pycache__/runpy.cpython-39.pyc' # /usr/lib64/python3.9/importlib/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/importlib/__init__.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/__init__.cpython-39.pyc' <<< 41684 1727204446.46779: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/warnings.cpython-39.pyc matches /usr/lib64/python3.9/warnings.py <<< 41684 1727204446.47162: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/warnings.cpython-39.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f502c326f10> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f502c326e50> # /usr/lib64/python3.9/importlib/__pycache__/machinery.cpython-39.pyc matches /usr/lib64/python3.9/importlib/machinery.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/machinery.cpython-39.pyc' import 'importlib.machinery' # <_frozen_importlib_external.SourceFileLoader object at 0x7f502c326dc0> # /usr/lib64/python3.9/importlib/__pycache__/util.cpython-39.pyc matches /usr/lib64/python3.9/importlib/util.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/util.cpython-39.pyc' # /usr/lib64/python3.9/importlib/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/importlib/abc.py # code object from 
'/usr/lib64/python3.9/importlib/__pycache__/abc.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/typing.cpython-39.pyc matches /usr/lib64/python3.9/typing.py # code object from '/usr/lib64/python3.9/__pycache__/typing.cpython-39.pyc' # /usr/lib64/python3.9/collections/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/collections/abc.py # code object from '/usr/lib64/python3.9/collections/__pycache__/abc.cpython-39.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f502c2fa430> # /usr/lib64/python3.9/__pycache__/contextlib.cpython-39.pyc matches /usr/lib64/python3.9/contextlib.py # code object from '/usr/lib64/python3.9/__pycache__/contextlib.cpython-39.pyc' import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f502c2fa520> <<< 41684 1727204446.47296: stdout chunk (state=3): >>>import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f502c32ffa0> <<< 41684 1727204446.47323: stdout chunk (state=3): >>>import 'importlib.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f502c329af0> <<< 41684 1727204446.47349: stdout chunk (state=3): >>>import 'importlib.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7f502c3294c0> <<< 41684 1727204446.47391: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/pkgutil.cpython-39.pyc matches /usr/lib64/python3.9/pkgutil.py <<< 41684 1727204446.47413: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/pkgutil.cpython-39.pyc' <<< 41684 1727204446.47473: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/weakref.cpython-39.pyc matches /usr/lib64/python3.9/weakref.py <<< 41684 1727204446.47500: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/weakref.cpython-39.pyc' <<< 41684 1727204446.47533: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/_weakrefset.cpython-39.pyc matches /usr/lib64/python3.9/_weakrefset.py <<< 41684 
1727204446.47557: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/_weakrefset.cpython-39.pyc' <<< 41684 1727204446.47583: stdout chunk (state=3): >>>import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7f502c22f280> <<< 41684 1727204446.47643: stdout chunk (state=3): >>>import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7f502c2e5dc0> <<< 41684 1727204446.47744: stdout chunk (state=3): >>>import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f502c329f70> import 'runpy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f502c3a00a0> <<< 41684 1727204446.47781: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/shutil.cpython-39.pyc matches /usr/lib64/python3.9/shutil.py <<< 41684 1727204446.47840: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/shutil.cpython-39.pyc' <<< 41684 1727204446.47887: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/fnmatch.cpython-39.pyc matches /usr/lib64/python3.9/fnmatch.py # code object from '/usr/lib64/python3.9/__pycache__/fnmatch.cpython-39.pyc' <<< 41684 1727204446.47912: stdout chunk (state=3): >>>import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7f502c23fbb0> <<< 41684 1727204446.47923: stdout chunk (state=3): >>>import 'errno' # <<< 41684 1727204446.47992: stdout chunk (state=3): >>># extension module 'zlib' loaded from '/usr/lib64/python3.9/lib-dynload/zlib.cpython-39-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.9/lib-dynload/zlib.cpython-39-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f502c23fee0> <<< 41684 1727204446.48028: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/bz2.cpython-39.pyc matches /usr/lib64/python3.9/bz2.py <<< 41684 1727204446.48059: stdout chunk (state=3): >>># code object from 
'/usr/lib64/python3.9/__pycache__/bz2.cpython-39.pyc' <<< 41684 1727204446.48105: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/_compression.cpython-39.pyc matches /usr/lib64/python3.9/_compression.py <<< 41684 1727204446.48130: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/_compression.cpython-39.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7f502c2517f0> <<< 41684 1727204446.48178: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/threading.cpython-39.pyc matches /usr/lib64/python3.9/threading.py <<< 41684 1727204446.48231: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/threading.cpython-39.pyc' <<< 41684 1727204446.48288: stdout chunk (state=3): >>>import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7f502c251d30> <<< 41684 1727204446.48371: stdout chunk (state=3): >>># extension module '_bz2' loaded from '/usr/lib64/python3.9/lib-dynload/_bz2.cpython-39-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.9/lib-dynload/_bz2.cpython-39-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f502c1df460> <<< 41684 1727204446.48428: stdout chunk (state=3): >>>import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7f502c23ffd0> <<< 41684 1727204446.48432: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/lzma.cpython-39.pyc matches /usr/lib64/python3.9/lzma.py<<< 41684 1727204446.48438: stdout chunk (state=3): >>> <<< 41684 1727204446.48441: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/lzma.cpython-39.pyc'<<< 41684 1727204446.48444: stdout chunk (state=3): >>> <<< 41684 1727204446.48495: stdout chunk (state=3): >>># extension module '_lzma' loaded from '/usr/lib64/python3.9/lib-dynload/_lzma.cpython-39-x86_64-linux-gnu.so'<<< 41684 1727204446.48516: stdout chunk (state=3): >>> 
<<< 41684 1727204446.48533: stdout chunk (state=3): >>># extension module '_lzma' executed from '/usr/lib64/python3.9/lib-dynload/_lzma.cpython-39-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f502c1ef340> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7f502c251670><<< 41684 1727204446.48548: stdout chunk (state=3): >>> <<< 41684 1727204446.48561: stdout chunk (state=3): >>>import 'pwd' # <<< 41684 1727204446.48600: stdout chunk (state=3): >>># extension module 'grp' loaded from '/usr/lib64/python3.9/lib-dynload/grp.cpython-39-x86_64-linux-gnu.so'<<< 41684 1727204446.48620: stdout chunk (state=3): >>> # extension module 'grp' executed from '/usr/lib64/python3.9/lib-dynload/grp.cpython-39-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f502c1ef400><<< 41684 1727204446.48630: stdout chunk (state=3): >>> <<< 41684 1727204446.48689: stdout chunk (state=3): >>>import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f502c326a90><<< 41684 1727204446.48697: stdout chunk (state=3): >>> <<< 41684 1727204446.48720: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/tempfile.cpython-39.pyc matches /usr/lib64/python3.9/tempfile.py<<< 41684 1727204446.48730: stdout chunk (state=3): >>> <<< 41684 1727204446.48759: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/tempfile.cpython-39.pyc'<<< 41684 1727204446.48768: stdout chunk (state=3): >>> <<< 41684 1727204446.48800: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/random.cpython-39.pyc matches /usr/lib64/python3.9/random.py <<< 41684 1727204446.48826: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/random.cpython-39.pyc' <<< 41684 1727204446.48899: stdout chunk (state=3): >>># extension module 'math' loaded from '/usr/lib64/python3.9/lib-dynload/math.cpython-39-x86_64-linux-gnu.so' # extension 
module 'math' executed from '/usr/lib64/python3.9/lib-dynload/math.cpython-39-x86_64-linux-gnu.so'<<< 41684 1727204446.48907: stdout chunk (state=3): >>> import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f502c20b760> <<< 41684 1727204446.48938: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/bisect.cpython-39.pyc matches /usr/lib64/python3.9/bisect.py<<< 41684 1727204446.48951: stdout chunk (state=3): >>> # code object from '/usr/lib64/python3.9/__pycache__/bisect.cpython-39.pyc'<<< 41684 1727204446.48958: stdout chunk (state=3): >>> <<< 41684 1727204446.48998: stdout chunk (state=3): >>># extension module '_bisect' loaded from '/usr/lib64/python3.9/lib-dynload/_bisect.cpython-39-x86_64-linux-gnu.so'<<< 41684 1727204446.49016: stdout chunk (state=3): >>> # extension module '_bisect' executed from '/usr/lib64/python3.9/lib-dynload/_bisect.cpython-39-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f502c20ba30><<< 41684 1727204446.49032: stdout chunk (state=3): >>> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7f502c20b820><<< 41684 1727204446.49039: stdout chunk (state=3): >>> <<< 41684 1727204446.49086: stdout chunk (state=3): >>># extension module '_random' loaded from '/usr/lib64/python3.9/lib-dynload/_random.cpython-39-x86_64-linux-gnu.so'<<< 41684 1727204446.49101: stdout chunk (state=3): >>> # extension module '_random' executed from '/usr/lib64/python3.9/lib-dynload/_random.cpython-39-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f502c20b910><<< 41684 1727204446.49109: stdout chunk (state=3): >>> <<< 41684 1727204446.49150: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/hashlib.cpython-39.pyc matches /usr/lib64/python3.9/hashlib.py<<< 41684 1727204446.49172: stdout chunk (state=3): >>> # code object from '/usr/lib64/python3.9/__pycache__/hashlib.cpython-39.pyc' <<< 41684 
1727204446.49455: stdout chunk (state=3): >>># extension module '_hashlib' loaded from '/usr/lib64/python3.9/lib-dynload/_hashlib.cpython-39-x86_64-linux-gnu.so'<<< 41684 1727204446.49475: stdout chunk (state=3): >>> # extension module '_hashlib' executed from '/usr/lib64/python3.9/lib-dynload/_hashlib.cpython-39-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f502c20bd60><<< 41684 1727204446.49483: stdout chunk (state=3): >>> <<< 41684 1727204446.49532: stdout chunk (state=3): >>># extension module '_blake2' loaded from '/usr/lib64/python3.9/lib-dynload/_blake2.cpython-39-x86_64-linux-gnu.so'<<< 41684 1727204446.49550: stdout chunk (state=3): >>> # extension module '_blake2' executed from '/usr/lib64/python3.9/lib-dynload/_blake2.cpython-39-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f502c2162b0><<< 41684 1727204446.49568: stdout chunk (state=3): >>> <<< 41684 1727204446.49580: stdout chunk (state=3): >>>import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f502c20b9a0> <<< 41684 1727204446.49608: stdout chunk (state=3): >>>import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7f502c1ffaf0><<< 41684 1727204446.49617: stdout chunk (state=3): >>> <<< 41684 1727204446.49652: stdout chunk (state=3): >>>import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f502c326670><<< 41684 1727204446.49660: stdout chunk (state=3): >>> <<< 41684 1727204446.49692: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/zipfile.cpython-39.pyc matches /usr/lib64/python3.9/zipfile.py<<< 41684 1727204446.49700: stdout chunk (state=3): >>> <<< 41684 1727204446.49782: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/zipfile.cpython-39.pyc'<<< 41684 1727204446.49789: stdout chunk (state=3): >>> <<< 41684 1727204446.49836: stdout chunk (state=3): >>>import 'zipfile' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f502c20bb50><<< 41684 1727204446.49843: stdout chunk (state=3): >>> <<< 41684 1727204446.49981: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/encodings/cp437.pyc' <<< 41684 1727204446.50012: stdout chunk (state=3): >>>import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7f502bbe6730><<< 41684 1727204446.50017: stdout chunk (state=3): >>> <<< 41684 1727204446.50281: stdout chunk (state=3): >>># zipimport: found 30 names in '/tmp/ansible_stat_payload_i4j83o7a/ansible_stat_payload.zip' # zipimport: zlib available <<< 41684 1727204446.50378: stdout chunk (state=3): >>># zipimport: zlib available <<< 41684 1727204446.50422: stdout chunk (state=3): >>>import ansible # loaded from Zip /tmp/ansible_stat_payload_i4j83o7a/ansible_stat_payload.zip/ansible/__init__.py # zipimport: zlib available <<< 41684 1727204446.50463: stdout chunk (state=3): >>># zipimport: zlib available <<< 41684 1727204446.50485: stdout chunk (state=3): >>>import ansible.module_utils # loaded from Zip /tmp/ansible_stat_payload_i4j83o7a/ansible_stat_payload.zip/ansible/module_utils/__init__.py <<< 41684 1727204446.50490: stdout chunk (state=3): >>># zipimport: zlib available <<< 41684 1727204446.52407: stdout chunk (state=3): >>># zipimport: zlib available <<< 41684 1727204446.53832: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/__future__.cpython-39.pyc matches /usr/lib64/python3.9/__future__.py <<< 41684 1727204446.53837: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/__future__.cpython-39.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7f502bb0d880> # /usr/lib64/python3.9/json/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/json/__init__.py <<< 41684 1727204446.53840: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/json/__pycache__/__init__.cpython-39.pyc' <<< 41684 
1727204446.53883: stdout chunk (state=3): >>># /usr/lib64/python3.9/json/__pycache__/decoder.cpython-39.pyc matches /usr/lib64/python3.9/json/decoder.py # code object from '/usr/lib64/python3.9/json/__pycache__/decoder.cpython-39.pyc' <<< 41684 1727204446.53887: stdout chunk (state=3): >>># /usr/lib64/python3.9/json/__pycache__/scanner.cpython-39.pyc matches /usr/lib64/python3.9/json/scanner.py # code object from '/usr/lib64/python3.9/json/__pycache__/scanner.cpython-39.pyc' <<< 41684 1727204446.53919: stdout chunk (state=3): >>># extension module '_json' loaded from '/usr/lib64/python3.9/lib-dynload/_json.cpython-39-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.9/lib-dynload/_json.cpython-39-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f502bb0d160> <<< 41684 1727204446.53979: stdout chunk (state=3): >>>import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7f502bb0d280> <<< 41684 1727204446.53984: stdout chunk (state=3): >>>import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f502bb0dfd0> <<< 41684 1727204446.54010: stdout chunk (state=3): >>># /usr/lib64/python3.9/json/__pycache__/encoder.cpython-39.pyc matches /usr/lib64/python3.9/json/encoder.py # code object from '/usr/lib64/python3.9/json/__pycache__/encoder.cpython-39.pyc' <<< 41684 1727204446.54062: stdout chunk (state=3): >>>import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f502bb0d4f0> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7f502bb0ddf0> <<< 41684 1727204446.54078: stdout chunk (state=3): >>>import 'atexit' # <<< 41684 1727204446.54102: stdout chunk (state=3): >>># extension module 'fcntl' loaded from '/usr/lib64/python3.9/lib-dynload/fcntl.cpython-39-x86_64-linux-gnu.so' <<< 41684 1727204446.54106: stdout chunk (state=3): >>># extension module 'fcntl' executed from 
'/usr/lib64/python3.9/lib-dynload/fcntl.cpython-39-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f502bb0d580> <<< 41684 1727204446.54118: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/locale.cpython-39.pyc matches /usr/lib64/python3.9/locale.py <<< 41684 1727204446.54143: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/locale.cpython-39.pyc' <<< 41684 1727204446.54188: stdout chunk (state=3): >>>import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f502bb0d100> <<< 41684 1727204446.54203: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/platform.cpython-39.pyc matches /usr/lib64/python3.9/platform.py <<< 41684 1727204446.54220: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/platform.cpython-39.pyc' <<< 41684 1727204446.54231: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/subprocess.cpython-39.pyc matches /usr/lib64/python3.9/subprocess.py <<< 41684 1727204446.54261: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/subprocess.cpython-39.pyc' <<< 41684 1727204446.54286: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/signal.cpython-39.pyc matches /usr/lib64/python3.9/signal.py <<< 41684 1727204446.54289: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/signal.cpython-39.pyc' <<< 41684 1727204446.54350: stdout chunk (state=3): >>>import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f502ba64fa0> <<< 41684 1727204446.54387: stdout chunk (state=3): >>># extension module '_posixsubprocess' loaded from '/usr/lib64/python3.9/lib-dynload/_posixsubprocess.cpython-39-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.9/lib-dynload/_posixsubprocess.cpython-39-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader 
object at 0x7f502ba82c70> <<< 41684 1727204446.54429: stdout chunk (state=3): >>># extension module 'select' loaded from '/usr/lib64/python3.9/lib-dynload/select.cpython-39-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.9/lib-dynload/select.cpython-39-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f502ba82f70> <<< 41684 1727204446.54442: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/selectors.cpython-39.pyc matches /usr/lib64/python3.9/selectors.py <<< 41684 1727204446.54466: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/selectors.cpython-39.pyc' <<< 41684 1727204446.54507: stdout chunk (state=3): >>>import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7f502ba82310> <<< 41684 1727204446.54510: stdout chunk (state=3): >>>import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7f502bb75dc0> <<< 41684 1727204446.54698: stdout chunk (state=3): >>>import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7f502bb753a0> <<< 41684 1727204446.54703: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/shlex.cpython-39.pyc matches /usr/lib64/python3.9/shlex.py <<< 41684 1727204446.54706: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/shlex.cpython-39.pyc' <<< 41684 1727204446.54724: stdout chunk (state=3): >>>import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7f502bb75f40> <<< 41684 1727204446.54745: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/traceback.cpython-39.pyc matches /usr/lib64/python3.9/traceback.py <<< 41684 1727204446.54753: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/traceback.cpython-39.pyc' <<< 41684 1727204446.54780: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/linecache.cpython-39.pyc matches 
/usr/lib64/python3.9/linecache.py # code object from '/usr/lib64/python3.9/__pycache__/linecache.cpython-39.pyc' <<< 41684 1727204446.54799: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/tokenize.cpython-39.pyc matches /usr/lib64/python3.9/tokenize.py <<< 41684 1727204446.54805: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/tokenize.cpython-39.pyc' <<< 41684 1727204446.54827: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/token.cpython-39.pyc matches /usr/lib64/python3.9/token.py # code object from '/usr/lib64/python3.9/__pycache__/token.cpython-39.pyc' <<< 41684 1727204446.54838: stdout chunk (state=3): >>>import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7f502bb44e80> <<< 41684 1727204446.54925: stdout chunk (state=3): >>>import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f502bae0d90> <<< 41684 1727204446.54931: stdout chunk (state=3): >>>import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7f502bae0460> import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7f502bb17550> <<< 41684 1727204446.54971: stdout chunk (state=3): >>># extension module 'syslog' loaded from '/usr/lib64/python3.9/lib-dynload/syslog.cpython-39-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.9/lib-dynload/syslog.cpython-39-x86_64-linux-gnu.so' <<< 41684 1727204446.54977: stdout chunk (state=3): >>>import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f502bae0580> <<< 41684 1727204446.54996: stdout chunk (state=3): >>># /usr/lib64/python3.9/site-packages/systemd/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/__init__.cpython-39.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7f502bae05b0> <<< 41684 1727204446.55016: 
stdout chunk (state=3): >>># /usr/lib64/python3.9/site-packages/systemd/__pycache__/journal.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/journal.py <<< 41684 1727204446.55030: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/journal.cpython-39.pyc' <<< 41684 1727204446.55041: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/datetime.cpython-39.pyc matches /usr/lib64/python3.9/datetime.py <<< 41684 1727204446.55076: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/datetime.cpython-39.pyc' <<< 41684 1727204446.55146: stdout chunk (state=3): >>># extension module '_datetime' loaded from '/usr/lib64/python3.9/lib-dynload/_datetime.cpython-39-x86_64-linux-gnu.so' <<< 41684 1727204446.55152: stdout chunk (state=3): >>># extension module '_datetime' executed from '/usr/lib64/python3.9/lib-dynload/_datetime.cpython-39-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f502ba55f70> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7f502bb562b0> <<< 41684 1727204446.55188: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/uuid.cpython-39.pyc matches /usr/lib64/python3.9/uuid.py <<< 41684 1727204446.55191: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/uuid.cpython-39.pyc'<<< 41684 1727204446.55194: stdout chunk (state=3): >>> <<< 41684 1727204446.55253: stdout chunk (state=3): >>># extension module '_uuid' loaded from '/usr/lib64/python3.9/lib-dynload/_uuid.cpython-39-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.9/lib-dynload/_uuid.cpython-39-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f502ba527f0> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7f502bb56430> <<< 41684 1727204446.55273: stdout chunk (state=3): >>># 
/usr/lib64/python3.9/logging/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/logging/__init__.py <<< 41684 1727204446.55298: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/logging/__pycache__/__init__.cpython-39.pyc' <<< 41684 1727204446.55330: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/string.cpython-39.pyc matches /usr/lib64/python3.9/string.py # code object from '/usr/lib64/python3.9/__pycache__/string.cpython-39.pyc' <<< 41684 1727204446.55334: stdout chunk (state=3): >>>import '_string' # <<< 41684 1727204446.55389: stdout chunk (state=3): >>>import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7f502bb6de80> <<< 41684 1727204446.55517: stdout chunk (state=3): >>>import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7f502ba52790> <<< 41684 1727204446.56020: stdout chunk (state=3): >>># extension module 'systemd._journal' loaded from '/usr/lib64/python3.9/site-packages/systemd/_journal.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.9/site-packages/systemd/_journal.cpython-39-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f502ba525e0> # extension module 'systemd._reader' loaded from '/usr/lib64/python3.9/site-packages/systemd/_reader.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.9/site-packages/systemd/_reader.cpython-39-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f502ba51550> # extension module 'systemd.id128' loaded from '/usr/lib64/python3.9/site-packages/systemd/id128.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.9/site-packages/systemd/id128.cpython-39-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 
0x7f502ba51490> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f502bb4e970> # /usr/lib64/python3.9/site-packages/systemd/__pycache__/daemon.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/daemon.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/socket.cpython-39.pyc matches /usr/lib64/python3.9/socket.py # code object from '/usr/lib64/python3.9/__pycache__/socket.cpython-39.pyc' # extension module '_socket' loaded from '/usr/lib64/python3.9/lib-dynload/_socket.cpython-39-x86_64-linux-gnu.so' # extension module '_socket' executed from '/usr/lib64/python3.9/lib-dynload/_socket.cpython-39-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f502bad66a0> <<< 41684 1727204446.56105: stdout chunk (state=3): >>># extension module 'array' loaded from '/usr/lib64/python3.9/lib-dynload/array.cpython-39-x86_64-linux-gnu.so' <<< 41684 1727204446.56115: stdout chunk (state=3): >>># extension module 'array' executed from '/usr/lib64/python3.9/lib-dynload/array.cpython-39-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f502bad5b80> <<< 41684 1727204446.56138: stdout chunk (state=3): >>>import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7f502bae60a0> <<< 41684 1727204446.56184: stdout chunk (state=3): >>># extension module 'systemd._daemon' loaded from '/usr/lib64/python3.9/site-packages/systemd/_daemon.cpython-39-x86_64-linux-gnu.so' <<< 41684 1727204446.56188: stdout chunk (state=3): >>># extension module 'systemd._daemon' executed from '/usr/lib64/python3.9/site-packages/systemd/_daemon.cpython-39-x86_64-linux-gnu.so' <<< 41684 1727204446.56201: stdout chunk (state=3): >>>import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f502bad6100> <<< 41684 1727204446.56209: stdout chunk 
(state=3): >>>import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7f502bb19be0> <<< 41684 1727204446.56227: stdout chunk (state=3): >>># zipimport: zlib available <<< 41684 1727204446.56257: stdout chunk (state=3): >>># zipimport: zlib available <<< 41684 1727204446.56269: stdout chunk (state=3): >>>import ansible.module_utils.compat # loaded from Zip /tmp/ansible_stat_payload_i4j83o7a/ansible_stat_payload.zip/ansible/module_utils/compat/__init__.py <<< 41684 1727204446.56296: stdout chunk (state=3): >>># zipimport: zlib available <<< 41684 1727204446.56412: stdout chunk (state=3): >>># zipimport: zlib available <<< 41684 1727204446.56528: stdout chunk (state=3): >>># zipimport: zlib available <<< 41684 1727204446.56538: stdout chunk (state=3): >>># zipimport: zlib available <<< 41684 1727204446.56559: stdout chunk (state=3): >>>import ansible.module_utils.common # loaded from Zip /tmp/ansible_stat_payload_i4j83o7a/ansible_stat_payload.zip/ansible/module_utils/common/__init__.py <<< 41684 1727204446.56578: stdout chunk (state=3): >>># zipimport: zlib available <<< 41684 1727204446.56609: stdout chunk (state=3): >>># zipimport: zlib available <<< 41684 1727204446.56616: stdout chunk (state=3): >>>import ansible.module_utils.common.text # loaded from Zip /tmp/ansible_stat_payload_i4j83o7a/ansible_stat_payload.zip/ansible/module_utils/common/text/__init__.py <<< 41684 1727204446.56646: stdout chunk (state=3): >>># zipimport: zlib available <<< 41684 1727204446.56798: stdout chunk (state=3): >>># zipimport: zlib available <<< 41684 1727204446.56953: stdout chunk (state=3): >>># zipimport: zlib available <<< 41684 1727204446.57689: stdout chunk (state=3): >>># zipimport: zlib available <<< 41684 1727204446.58388: stdout chunk (state=3): >>>import ansible.module_utils.six # loaded from Zip /tmp/ansible_stat_payload_i4j83o7a/ansible_stat_payload.zip/ansible/module_utils/six/__init__.py <<< 41684 1727204446.58392: stdout chunk (state=3): 
>>>import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # <<< 41684 1727204446.58405: stdout chunk (state=3): >>>import ansible.module_utils.common.text.converters # loaded from Zip /tmp/ansible_stat_payload_i4j83o7a/ansible_stat_payload.zip/ansible/module_utils/common/text/converters.py <<< 41684 1727204446.58424: stdout chunk (state=3): >>># /usr/lib64/python3.9/ctypes/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/ctypes/__init__.py <<< 41684 1727204446.58443: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/ctypes/__pycache__/__init__.cpython-39.pyc' <<< 41684 1727204446.58646: stdout chunk (state=3): >>># extension module '_ctypes' loaded from '/usr/lib64/python3.9/lib-dynload/_ctypes.cpython-39-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.9/lib-dynload/_ctypes.cpython-39-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f502ba1dac0> <<< 41684 1727204446.58742: stdout chunk (state=3): >>># /usr/lib64/python3.9/ctypes/__pycache__/_endian.cpython-39.pyc matches /usr/lib64/python3.9/ctypes/_endian.py # code object from '/usr/lib64/python3.9/ctypes/__pycache__/_endian.cpython-39.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7f502bad3d00> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7f502baca850> import ansible.module_utils.compat.selinux # loaded from Zip /tmp/ansible_stat_payload_i4j83o7a/ansible_stat_payload.zip/ansible/module_utils/compat/selinux.py # zipimport: zlib available <<< 41684 1727204446.58835: stdout chunk (state=3): >>># zipimport: zlib available import ansible.module_utils._text # loaded from Zip /tmp/ansible_stat_payload_i4j83o7a/ansible_stat_payload.zip/ansible/module_utils/_text.py # zipimport: zlib available <<< 41684 1727204446.58922: stdout chunk (state=3): >>># zipimport: zlib available <<< 41684 
1727204446.59104: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/copy.cpython-39.pyc matches /usr/lib64/python3.9/copy.py # code object from '/usr/lib64/python3.9/__pycache__/copy.cpython-39.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f502bad59d0> # zipimport: zlib available <<< 41684 1727204446.59498: stdout chunk (state=3): >>># zipimport: zlib available <<< 41684 1727204446.59864: stdout chunk (state=3): >>># zipimport: zlib available <<< 41684 1727204446.59914: stdout chunk (state=3): >>># zipimport: zlib available <<< 41684 1727204446.59999: stdout chunk (state=3): >>>import ansible.module_utils.common.collections # loaded from Zip /tmp/ansible_stat_payload_i4j83o7a/ansible_stat_payload.zip/ansible/module_utils/common/collections.py # zipimport: zlib available <<< 41684 1727204446.60023: stdout chunk (state=3): >>># zipimport: zlib available <<< 41684 1727204446.60069: stdout chunk (state=3): >>>import ansible.module_utils.common.warnings # loaded from Zip /tmp/ansible_stat_payload_i4j83o7a/ansible_stat_payload.zip/ansible/module_utils/common/warnings.py # zipimport: zlib available <<< 41684 1727204446.60126: stdout chunk (state=3): >>># zipimport: zlib available <<< 41684 1727204446.60185: stdout chunk (state=3): >>>import ansible.module_utils.errors # loaded from Zip /tmp/ansible_stat_payload_i4j83o7a/ansible_stat_payload.zip/ansible/module_utils/errors.py <<< 41684 1727204446.60223: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 41684 1727204446.60226: stdout chunk (state=3): >>>import ansible.module_utils.parsing # loaded from Zip /tmp/ansible_stat_payload_i4j83o7a/ansible_stat_payload.zip/ansible/module_utils/parsing/__init__.py <<< 41684 1727204446.60227: stdout chunk (state=3): >>># zipimport: zlib available <<< 41684 1727204446.60703: stdout chunk (state=3): >>># zipimport: zlib available import ansible.module_utils.parsing.convert_bool # loaded from Zip 
/tmp/ansible_stat_payload_i4j83o7a/ansible_stat_payload.zip/ansible/module_utils/parsing/convert_bool.py # zipimport: zlib available # zipimport: zlib available <<< 41684 1727204446.60933: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/ast.cpython-39.pyc matches /usr/lib64/python3.9/ast.py <<< 41684 1727204446.60973: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/ast.cpython-39.pyc' <<< 41684 1727204446.60983: stdout chunk (state=3): >>>import '_ast' # <<< 41684 1727204446.61081: stdout chunk (state=3): >>>import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7f502b63d310> <<< 41684 1727204446.61089: stdout chunk (state=3): >>># zipimport: zlib available <<< 41684 1727204446.61177: stdout chunk (state=3): >>># zipimport: zlib available <<< 41684 1727204446.61276: stdout chunk (state=3): >>>import ansible.module_utils.common.text.formatters # loaded from Zip /tmp/ansible_stat_payload_i4j83o7a/ansible_stat_payload.zip/ansible/module_utils/common/text/formatters.py <<< 41684 1727204446.61280: stdout chunk (state=3): >>>import ansible.module_utils.common.validation # loaded from Zip /tmp/ansible_stat_payload_i4j83o7a/ansible_stat_payload.zip/ansible/module_utils/common/validation.py import ansible.module_utils.common.parameters # loaded from Zip /tmp/ansible_stat_payload_i4j83o7a/ansible_stat_payload.zip/ansible/module_utils/common/parameters.py <<< 41684 1727204446.61282: stdout chunk (state=3): >>>import ansible.module_utils.common.arg_spec # loaded from Zip /tmp/ansible_stat_payload_i4j83o7a/ansible_stat_payload.zip/ansible/module_utils/common/arg_spec.py <<< 41684 1727204446.61302: stdout chunk (state=3): >>># zipimport: zlib available <<< 41684 1727204446.61353: stdout chunk (state=3): >>># zipimport: zlib available <<< 41684 1727204446.61401: stdout chunk (state=3): >>>import ansible.module_utils.common.locale # loaded from Zip 
/tmp/ansible_stat_payload_i4j83o7a/ansible_stat_payload.zip/ansible/module_utils/common/locale.py <<< 41684 1727204446.61406: stdout chunk (state=3): >>># zipimport: zlib available <<< 41684 1727204446.61465: stdout chunk (state=3): >>># zipimport: zlib available <<< 41684 1727204446.61511: stdout chunk (state=3): >>># zipimport: zlib available <<< 41684 1727204446.61982: stdout chunk (state=3): >>># zipimport: zlib available # /usr/lib64/python3.9/site-packages/selinux/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/selinux/__init__.py # code object from '/usr/lib64/python3.9/site-packages/selinux/__pycache__/__init__.cpython-39.pyc' # extension module 'selinux._selinux' loaded from '/usr/lib64/python3.9/site-packages/selinux/_selinux.cpython-39-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.9/site-packages/selinux/_selinux.cpython-39-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f502bb5e2b0> import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7f502bad37c0> import ansible.module_utils.common.file # loaded from Zip /tmp/ansible_stat_payload_i4j83o7a/ansible_stat_payload.zip/ansible/module_utils/common/file.py import ansible.module_utils.common.process # loaded from Zip /tmp/ansible_stat_payload_i4j83o7a/ansible_stat_payload.zip/ansible/module_utils/common/process.py # zipimport: zlib available <<< 41684 1727204446.62107: stdout chunk (state=3): >>># zipimport: zlib available<<< 41684 1727204446.62113: stdout chunk (state=3): >>> <<< 41684 1727204446.62203: stdout chunk (state=3): >>># zipimport: zlib available<<< 41684 1727204446.62208: stdout chunk (state=3): >>> <<< 41684 1727204446.62251: stdout chunk (state=3): >>># zipimport: zlib available<<< 41684 1727204446.62258: stdout chunk (state=3): >>> <<< 41684 1727204446.62319: stdout chunk (state=3): >>># 
/usr/lib/python3.9/site-packages/__pycache__/distro.cpython-39.pyc matches /usr/lib/python3.9/site-packages/distro.py<<< 41684 1727204446.62328: stdout chunk (state=3): >>> <<< 41684 1727204446.62350: stdout chunk (state=3): >>># code object from '/usr/lib/python3.9/site-packages/__pycache__/distro.cpython-39.pyc'<<< 41684 1727204446.62359: stdout chunk (state=3): >>> <<< 41684 1727204446.62391: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/argparse.cpython-39.pyc matches /usr/lib64/python3.9/argparse.py<<< 41684 1727204446.62396: stdout chunk (state=3): >>> <<< 41684 1727204446.62450: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/argparse.cpython-39.pyc'<<< 41684 1727204446.62456: stdout chunk (state=3): >>> <<< 41684 1727204446.62492: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/gettext.cpython-39.pyc matches /usr/lib64/python3.9/gettext.py<<< 41684 1727204446.62495: stdout chunk (state=3): >>> <<< 41684 1727204446.62522: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/gettext.cpython-39.pyc'<<< 41684 1727204446.62525: stdout chunk (state=3): >>> <<< 41684 1727204446.62681: stdout chunk (state=3): >>>import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7f502b620760><<< 41684 1727204446.62686: stdout chunk (state=3): >>> <<< 41684 1727204446.62770: stdout chunk (state=3): >>>import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f502ba14610><<< 41684 1727204446.62775: stdout chunk (state=3): >>> <<< 41684 1727204446.62866: stdout chunk (state=3): >>>import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f502ba13b80><<< 41684 1727204446.62879: stdout chunk (state=3): >>> <<< 41684 1727204446.62885: stdout chunk (state=3): >>># destroy ansible.module_utils.distro<<< 41684 1727204446.62902: stdout chunk (state=3): >>> import ansible.module_utils.distro # loaded from Zip 
/tmp/ansible_stat_payload_i4j83o7a/ansible_stat_payload.zip/ansible/module_utils/distro/__init__.py <<< 41684 1727204446.62907: stdout chunk (state=3): >>># zipimport: zlib available <<< 41684 1727204446.62971: stdout chunk (state=3): >>># zipimport: zlib available <<< 41684 1727204446.62985: stdout chunk (state=3): >>>import ansible.module_utils.common._utils # loaded from Zip /tmp/ansible_stat_payload_i4j83o7a/ansible_stat_payload.zip/ansible/module_utils/common/_utils.py import ansible.module_utils.common.sys_info # loaded from Zip /tmp/ansible_stat_payload_i4j83o7a/ansible_stat_payload.zip/ansible/module_utils/common/sys_info.py <<< 41684 1727204446.63061: stdout chunk (state=3): >>>import ansible.module_utils.basic # loaded from Zip /tmp/ansible_stat_payload_i4j83o7a/ansible_stat_payload.zip/ansible/module_utils/basic.py <<< 41684 1727204446.63081: stdout chunk (state=3): >>># zipimport: zlib available <<< 41684 1727204446.63094: stdout chunk (state=3): >>># zipimport: zlib available import ansible.modules # loaded from Zip /tmp/ansible_stat_payload_i4j83o7a/ansible_stat_payload.zip/ansible/modules/__init__.py # zipimport: zlib available <<< 41684 1727204446.63199: stdout chunk (state=3): >>># zipimport: zlib available <<< 41684 1727204446.63371: stdout chunk (state=3): >>># zipimport: zlib available <<< 41684 1727204446.63503: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"path": "/run/ostree-booted", "follow": false, "get_checksum": true, "get_mime": true, "get_attributes": true, "checksum_algorithm": "sha1"}}} # destroy __main__ <<< 41684 1727204446.63747: stdout chunk (state=3): >>># clear builtins._ # clear sys.path # clear sys.argv <<< 41684 1727204446.63751: stdout chunk (state=3): >>># clear sys.ps1 # clear sys.ps2 # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.path_hooks # clear sys.path_importer_cache # clear sys.meta_path # clear sys.__interactivehook__ # 
restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing encodings.latin_1 # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing _locale # cleanup[2] removing _bootlocale # destroy _bootlocale # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing sre_constants # destroy sre_constants # cleanup[2] removing sre_parse <<< 41684 1727204446.63787: stdout chunk (state=3): >>># cleanup[2] removing sre_compile # cleanup[2] removing _heapq # cleanup[2] removing heapq # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools <<< 41684 1727204446.63793: stdout chunk (state=3): >>># cleanup[2] removing functools # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] 
removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # destroy base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing collections.abc # cleanup[2] removing contextlib <<< 41684 1727204446.63838: stdout chunk (state=3): >>># cleanup[2] removing typing # destroy typing # cleanup[2] removing importlib.abc # cleanup[2] removing importlib.util # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing threading # cleanup[2] removing _bz2 # destroy _bz2 <<< 41684 1727204446.63844: stdout chunk (state=3): >>># cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing pwd # cleanup[2] removing grp # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing tempfile # cleanup[2] removing zipfile # destroy zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing fcntl # cleanup[2] removing locale # cleanup[2] removing signal # cleanup[2] removing 
_posixsubprocess # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing subprocess # cleanup[2] removing platform # cleanup[2] removing shlex # cleanup[2] removing token # destroy token # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal <<< 41684 1727204446.63847: stdout chunk (state=3): >>># cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # destroy socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy 
ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec <<< 41684 1727204446.63855: stdout chunk (state=3): >>># destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file <<< 41684 1727204446.64445: stdout chunk (state=3): >>># cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing 
ansible.modules # destroy ansible.modules <<< 41684 1727204446.64474: stdout chunk (state=3): >>># destroy _sitebuiltins # destroy importlib.util # destroy importlib.abc # destroy importlib.machinery # destroy zipimport # destroy _compression # destroy binascii # destroy importlib # destroy struct # destroy bz2 # destroy lzma # destroy __main__ # destroy locale # destroy tempfile # destroy systemd.journal # destroy systemd.daemon # destroy ansible.module_utils.compat.selinux # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy encodings # destroy syslog # destroy uuid # destroy array # destroy datetime # destroy selinux # destroy distro # destroy json # destroy shlex # destroy logging # destroy argparse # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # cleanup[3] wiping tokenize # cleanup[3] wiping platform # destroy subprocess # cleanup[3] wiping selectors # cleanup[3] wiping select # cleanup[3] wiping _posixsubprocess # cleanup[3] wiping signal # cleanup[3] wiping fcntl # cleanup[3] wiping atexit # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping _blake2 # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping shutil # destroy fnmatch # cleanup[3] wiping grp # cleanup[3] wiping pwd # cleanup[3] wiping _lzma # cleanup[3] wiping threading # cleanup[3] wiping zlib # cleanup[3] wiping errno # cleanup[3] wiping weakref # cleanup[3] wiping contextlib # cleanup[3] 
wiping collections.abc # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy enum # destroy sre_compile # destroy copyreg # cleanup[3] wiping functools # cleanup[3] wiping _functools # destroy _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy heapq # destroy collections.abc # cleanup[3] wiping _collections # destroy _collections # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping itertools # cleanup[3] wiping _heapq # cleanup[3] wiping sre_parse # cleanup[3] wiping _sre # cleanup[3] wiping types # cleanup[3] wiping _locale # destroy _locale # cleanup[3] wiping os # cleanup[3] wiping os.path # destroy genericpath # cleanup[3] wiping posixpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.latin_1 # cleanup[3] wiping _signal # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy systemd._daemon # destroy _socket # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy fcntl # destroy _blake2 # destroy _lzma # destroy zlib # destroy _signal <<< 41684 1727204446.64551: stdout chunk (state=3): >>># destroy platform # destroy _uuid <<< 41684 1727204446.64570: stdout chunk (state=3): >>># destroy _sre # destroy sre_parse # destroy tokenize <<< 41684 1727204446.64585: stdout chunk (state=3): >>># destroy 
_heapq # destroy posixpath # destroy stat <<< 41684 1727204446.64592: stdout chunk (state=3): >>># destroy ansible.module_utils.six.moves.urllib # destroy errno # destroy signal # destroy contextlib # destroy pwd # destroy grp # destroy _posixsubprocess # destroy selectors <<< 41684 1727204446.64622: stdout chunk (state=3): >>># destroy select # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy itertools # destroy operator # destroy ansible.module_utils.six.moves # destroy _operator # destroy _frozen_importlib_external # destroy _imp # destroy io # destroy marshal <<< 41684 1727204446.64656: stdout chunk (state=3): >>># destroy _frozen_importlib # clear sys.audit hooks <<< 41684 1727204446.65037: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.148 closed. 
<<< 41684 1727204446.65102: stderr chunk (state=3): >>><<< 41684 1727204446.65105: stdout chunk (state=3): >>><<< 41684 1727204446.65176: _low_level_execute_command() done: rc=0, stdout=import _frozen_importlib # frozen import _imp # builtin import '_thread' # import '_warnings' # import '_weakref' # import '_io' # import 'marshal' # import 'posix' # import '_frozen_importlib_external' # # installing zipimport hook import 'time' # import 'zipimport' # # installed zipimport hook # /usr/lib64/python3.9/encodings/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/encodings/__init__.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/codecs.cpython-39.pyc matches /usr/lib64/python3.9/codecs.py # code object from '/usr/lib64/python3.9/__pycache__/codecs.cpython-39.pyc' import '_codecs' # import 'codecs' # <_frozen_importlib_external.SourceFileLoader object at 0x7f502c698dc0> # /usr/lib64/python3.9/encodings/__pycache__/aliases.cpython-39.pyc matches /usr/lib64/python3.9/encodings/aliases.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/aliases.cpython-39.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7f502c63d3a0> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f502c698b20> # /usr/lib64/python3.9/encodings/__pycache__/utf_8.cpython-39.pyc matches /usr/lib64/python3.9/encodings/utf_8.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/utf_8.cpython-39.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7f502c698ac0> import '_signal' # # /usr/lib64/python3.9/encodings/__pycache__/latin_1.cpython-39.pyc matches /usr/lib64/python3.9/encodings/latin_1.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/latin_1.cpython-39.pyc' import 'encodings.latin_1' # <_frozen_importlib_external.SourceFileLoader object at 0x7f502c63d490> # 
/usr/lib64/python3.9/__pycache__/io.cpython-39.pyc matches /usr/lib64/python3.9/io.py # code object from '/usr/lib64/python3.9/__pycache__/io.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/abc.py # code object from '/usr/lib64/python3.9/__pycache__/abc.cpython-39.pyc' import '_abc' # import 'abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f502c63d940> import 'io' # <_frozen_importlib_external.SourceFileLoader object at 0x7f502c63d670> # /usr/lib64/python3.9/__pycache__/site.cpython-39.pyc matches /usr/lib64/python3.9/site.py # code object from '/usr/lib64/python3.9/__pycache__/site.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/os.cpython-39.pyc matches /usr/lib64/python3.9/os.py # code object from '/usr/lib64/python3.9/__pycache__/os.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/stat.cpython-39.pyc matches /usr/lib64/python3.9/stat.py # code object from '/usr/lib64/python3.9/__pycache__/stat.cpython-39.pyc' import '_stat' # import 'stat' # <_frozen_importlib_external.SourceFileLoader object at 0x7f502c3cf190> # /usr/lib64/python3.9/__pycache__/_collections_abc.cpython-39.pyc matches /usr/lib64/python3.9/_collections_abc.py # code object from '/usr/lib64/python3.9/__pycache__/_collections_abc.cpython-39.pyc' import '_collections_abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f502c3cf220> # /usr/lib64/python3.9/__pycache__/posixpath.cpython-39.pyc matches /usr/lib64/python3.9/posixpath.py # code object from '/usr/lib64/python3.9/__pycache__/posixpath.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/genericpath.cpython-39.pyc matches /usr/lib64/python3.9/genericpath.py # code object from '/usr/lib64/python3.9/__pycache__/genericpath.cpython-39.pyc' import 'genericpath' # <_frozen_importlib_external.SourceFileLoader object at 0x7f502c3f2850> import 'posixpath' # <_frozen_importlib_external.SourceFileLoader object at 0x7f502c3cf940> import 'os' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f502c655880> # /usr/lib64/python3.9/__pycache__/_sitebuiltins.cpython-39.pyc matches /usr/lib64/python3.9/_sitebuiltins.py # code object from '/usr/lib64/python3.9/__pycache__/_sitebuiltins.cpython-39.pyc' import '_sitebuiltins' # <_frozen_importlib_external.SourceFileLoader object at 0x7f502c3c8d90> # /usr/lib64/python3.9/__pycache__/_bootlocale.cpython-39.pyc matches /usr/lib64/python3.9/_bootlocale.py # code object from '/usr/lib64/python3.9/__pycache__/_bootlocale.cpython-39.pyc' import '_locale' # import '_bootlocale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f502c3f2d90> import 'site' # <_frozen_importlib_external.SourceFileLoader object at 0x7f502c63d970> Python 3.9.19 (main, Aug 23 2024, 00:00:00) [GCC 11.5.0 20240719 (Red Hat 11.5.0-2)] on linux Type "help", "copyright", "credits" or "license" for more information. # /usr/lib64/python3.9/__pycache__/base64.cpython-39.pyc matches /usr/lib64/python3.9/base64.py # code object from '/usr/lib64/python3.9/__pycache__/base64.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/re.cpython-39.pyc matches /usr/lib64/python3.9/re.py # code object from '/usr/lib64/python3.9/__pycache__/re.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/enum.cpython-39.pyc matches /usr/lib64/python3.9/enum.py # code object from '/usr/lib64/python3.9/__pycache__/enum.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/types.cpython-39.pyc matches /usr/lib64/python3.9/types.py # code object from '/usr/lib64/python3.9/__pycache__/types.cpython-39.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7f502c393f10> import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7f502c3990a0> # /usr/lib64/python3.9/__pycache__/sre_compile.cpython-39.pyc matches /usr/lib64/python3.9/sre_compile.py # code object from '/usr/lib64/python3.9/__pycache__/sre_compile.cpython-39.pyc' import '_sre' # # 
/usr/lib64/python3.9/__pycache__/sre_parse.cpython-39.pyc matches /usr/lib64/python3.9/sre_parse.py # code object from '/usr/lib64/python3.9/__pycache__/sre_parse.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/sre_constants.cpython-39.pyc matches /usr/lib64/python3.9/sre_constants.py # code object from '/usr/lib64/python3.9/__pycache__/sre_constants.cpython-39.pyc' import 'sre_constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7f502c38c5b0> import 'sre_parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f502c3946a0> import 'sre_compile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f502c3933d0> # /usr/lib64/python3.9/__pycache__/functools.cpython-39.pyc matches /usr/lib64/python3.9/functools.py # code object from '/usr/lib64/python3.9/__pycache__/functools.cpython-39.pyc' # /usr/lib64/python3.9/collections/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/collections/__init__.py # code object from '/usr/lib64/python3.9/collections/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/heapq.cpython-39.pyc matches /usr/lib64/python3.9/heapq.py # code object from '/usr/lib64/python3.9/__pycache__/heapq.cpython-39.pyc' # extension module '_heapq' loaded from '/usr/lib64/python3.9/lib-dynload/_heapq.cpython-39-x86_64-linux-gnu.so' # extension module '_heapq' executed from '/usr/lib64/python3.9/lib-dynload/_heapq.cpython-39-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f502c316e80> import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7f502c316970> import 'itertools' # # /usr/lib64/python3.9/__pycache__/keyword.cpython-39.pyc matches /usr/lib64/python3.9/keyword.py # code object from '/usr/lib64/python3.9/__pycache__/keyword.cpython-39.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7f502c316f70> # /usr/lib64/python3.9/__pycache__/operator.cpython-39.pyc matches 
/usr/lib64/python3.9/operator.py # code object from '/usr/lib64/python3.9/__pycache__/operator.cpython-39.pyc' import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7f502c316dc0> # /usr/lib64/python3.9/__pycache__/reprlib.cpython-39.pyc matches /usr/lib64/python3.9/reprlib.py # code object from '/usr/lib64/python3.9/__pycache__/reprlib.cpython-39.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f502c326130> import '_collections' # import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7f502c36edf0> import '_functools' # import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7f502c3676d0> # /usr/lib64/python3.9/__pycache__/copyreg.cpython-39.pyc matches /usr/lib64/python3.9/copyreg.py # code object from '/usr/lib64/python3.9/__pycache__/copyreg.cpython-39.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7f502c37a730> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7f502c39ae80> # /usr/lib64/python3.9/__pycache__/struct.cpython-39.pyc matches /usr/lib64/python3.9/struct.py # code object from '/usr/lib64/python3.9/__pycache__/struct.cpython-39.pyc' # extension module '_struct' loaded from '/usr/lib64/python3.9/lib-dynload/_struct.cpython-39-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.9/lib-dynload/_struct.cpython-39-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f502c326d30> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7f502c36e310> # extension module 'binascii' loaded from '/usr/lib64/python3.9/lib-dynload/binascii.cpython-39-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.9/lib-dynload/binascii.cpython-39-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f502c37a340> import 
'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7f502c3a0a30> # /usr/lib64/python3.9/__pycache__/runpy.cpython-39.pyc matches /usr/lib64/python3.9/runpy.py # code object from '/usr/lib64/python3.9/__pycache__/runpy.cpython-39.pyc' # /usr/lib64/python3.9/importlib/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/importlib/__init__.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/warnings.cpython-39.pyc matches /usr/lib64/python3.9/warnings.py # code object from '/usr/lib64/python3.9/__pycache__/warnings.cpython-39.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f502c326f10> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f502c326e50> # /usr/lib64/python3.9/importlib/__pycache__/machinery.cpython-39.pyc matches /usr/lib64/python3.9/importlib/machinery.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/machinery.cpython-39.pyc' import 'importlib.machinery' # <_frozen_importlib_external.SourceFileLoader object at 0x7f502c326dc0> # /usr/lib64/python3.9/importlib/__pycache__/util.cpython-39.pyc matches /usr/lib64/python3.9/importlib/util.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/util.cpython-39.pyc' # /usr/lib64/python3.9/importlib/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/importlib/abc.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/abc.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/typing.cpython-39.pyc matches /usr/lib64/python3.9/typing.py # code object from '/usr/lib64/python3.9/__pycache__/typing.cpython-39.pyc' # /usr/lib64/python3.9/collections/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/collections/abc.py # code object from '/usr/lib64/python3.9/collections/__pycache__/abc.cpython-39.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f502c2fa430> # 
/usr/lib64/python3.9/__pycache__/contextlib.cpython-39.pyc matches /usr/lib64/python3.9/contextlib.py # code object from '/usr/lib64/python3.9/__pycache__/contextlib.cpython-39.pyc' import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f502c2fa520> import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f502c32ffa0> import 'importlib.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f502c329af0> import 'importlib.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7f502c3294c0> # /usr/lib64/python3.9/__pycache__/pkgutil.cpython-39.pyc matches /usr/lib64/python3.9/pkgutil.py # code object from '/usr/lib64/python3.9/__pycache__/pkgutil.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/weakref.cpython-39.pyc matches /usr/lib64/python3.9/weakref.py # code object from '/usr/lib64/python3.9/__pycache__/weakref.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/_weakrefset.cpython-39.pyc matches /usr/lib64/python3.9/_weakrefset.py # code object from '/usr/lib64/python3.9/__pycache__/_weakrefset.cpython-39.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7f502c22f280> import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7f502c2e5dc0> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f502c329f70> import 'runpy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f502c3a00a0> # /usr/lib64/python3.9/__pycache__/shutil.cpython-39.pyc matches /usr/lib64/python3.9/shutil.py # code object from '/usr/lib64/python3.9/__pycache__/shutil.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/fnmatch.cpython-39.pyc matches /usr/lib64/python3.9/fnmatch.py # code object from '/usr/lib64/python3.9/__pycache__/fnmatch.cpython-39.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7f502c23fbb0> import 'errno' # # extension module 'zlib' loaded from 
'/usr/lib64/python3.9/lib-dynload/zlib.cpython-39-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.9/lib-dynload/zlib.cpython-39-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f502c23fee0> # /usr/lib64/python3.9/__pycache__/bz2.cpython-39.pyc matches /usr/lib64/python3.9/bz2.py # code object from '/usr/lib64/python3.9/__pycache__/bz2.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/_compression.cpython-39.pyc matches /usr/lib64/python3.9/_compression.py # code object from '/usr/lib64/python3.9/__pycache__/_compression.cpython-39.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7f502c2517f0> # /usr/lib64/python3.9/__pycache__/threading.cpython-39.pyc matches /usr/lib64/python3.9/threading.py # code object from '/usr/lib64/python3.9/__pycache__/threading.cpython-39.pyc' import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7f502c251d30> # extension module '_bz2' loaded from '/usr/lib64/python3.9/lib-dynload/_bz2.cpython-39-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.9/lib-dynload/_bz2.cpython-39-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f502c1df460> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7f502c23ffd0> # /usr/lib64/python3.9/__pycache__/lzma.cpython-39.pyc matches /usr/lib64/python3.9/lzma.py # code object from '/usr/lib64/python3.9/__pycache__/lzma.cpython-39.pyc' # extension module '_lzma' loaded from '/usr/lib64/python3.9/lib-dynload/_lzma.cpython-39-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.9/lib-dynload/_lzma.cpython-39-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f502c1ef340> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7f502c251670> import 'pwd' # # extension module 'grp' 
loaded from '/usr/lib64/python3.9/lib-dynload/grp.cpython-39-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.9/lib-dynload/grp.cpython-39-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f502c1ef400> import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f502c326a90> # /usr/lib64/python3.9/__pycache__/tempfile.cpython-39.pyc matches /usr/lib64/python3.9/tempfile.py # code object from '/usr/lib64/python3.9/__pycache__/tempfile.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/random.cpython-39.pyc matches /usr/lib64/python3.9/random.py # code object from '/usr/lib64/python3.9/__pycache__/random.cpython-39.pyc' # extension module 'math' loaded from '/usr/lib64/python3.9/lib-dynload/math.cpython-39-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.9/lib-dynload/math.cpython-39-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f502c20b760> # /usr/lib64/python3.9/__pycache__/bisect.cpython-39.pyc matches /usr/lib64/python3.9/bisect.py # code object from '/usr/lib64/python3.9/__pycache__/bisect.cpython-39.pyc' # extension module '_bisect' loaded from '/usr/lib64/python3.9/lib-dynload/_bisect.cpython-39-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.9/lib-dynload/_bisect.cpython-39-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f502c20ba30> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7f502c20b820> # extension module '_random' loaded from '/usr/lib64/python3.9/lib-dynload/_random.cpython-39-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.9/lib-dynload/_random.cpython-39-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f502c20b910> # /usr/lib64/python3.9/__pycache__/hashlib.cpython-39.pyc 
matches /usr/lib64/python3.9/hashlib.py # code object from '/usr/lib64/python3.9/__pycache__/hashlib.cpython-39.pyc' # extension module '_hashlib' loaded from '/usr/lib64/python3.9/lib-dynload/_hashlib.cpython-39-x86_64-linux-gnu.so' # extension module '_hashlib' executed from '/usr/lib64/python3.9/lib-dynload/_hashlib.cpython-39-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f502c20bd60> # extension module '_blake2' loaded from '/usr/lib64/python3.9/lib-dynload/_blake2.cpython-39-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.9/lib-dynload/_blake2.cpython-39-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f502c2162b0> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f502c20b9a0> import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7f502c1ffaf0> import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f502c326670> # /usr/lib64/python3.9/__pycache__/zipfile.cpython-39.pyc matches /usr/lib64/python3.9/zipfile.py # code object from '/usr/lib64/python3.9/__pycache__/zipfile.cpython-39.pyc' import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f502c20bb50> # code object from '/usr/lib64/python3.9/encodings/cp437.pyc' import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7f502bbe6730> # zipimport: found 30 names in '/tmp/ansible_stat_payload_i4j83o7a/ansible_stat_payload.zip' # zipimport: zlib available # zipimport: zlib available import ansible # loaded from Zip /tmp/ansible_stat_payload_i4j83o7a/ansible_stat_payload.zip/ansible/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils # loaded from Zip /tmp/ansible_stat_payload_i4j83o7a/ansible_stat_payload.zip/ansible/module_utils/__init__.py # zipimport: zlib available # zipimport: zlib available # 
/usr/lib64/python3.9/__pycache__/__future__.cpython-39.pyc matches /usr/lib64/python3.9/__future__.py # code object from '/usr/lib64/python3.9/__pycache__/__future__.cpython-39.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7f502bb0d880> # /usr/lib64/python3.9/json/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/json/__init__.py # code object from '/usr/lib64/python3.9/json/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/json/__pycache__/decoder.cpython-39.pyc matches /usr/lib64/python3.9/json/decoder.py # code object from '/usr/lib64/python3.9/json/__pycache__/decoder.cpython-39.pyc' # /usr/lib64/python3.9/json/__pycache__/scanner.cpython-39.pyc matches /usr/lib64/python3.9/json/scanner.py # code object from '/usr/lib64/python3.9/json/__pycache__/scanner.cpython-39.pyc' # extension module '_json' loaded from '/usr/lib64/python3.9/lib-dynload/_json.cpython-39-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.9/lib-dynload/_json.cpython-39-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f502bb0d160> import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7f502bb0d280> import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f502bb0dfd0> # /usr/lib64/python3.9/json/__pycache__/encoder.cpython-39.pyc matches /usr/lib64/python3.9/json/encoder.py # code object from '/usr/lib64/python3.9/json/__pycache__/encoder.cpython-39.pyc' import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f502bb0d4f0> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7f502bb0ddf0> import 'atexit' # # extension module 'fcntl' loaded from '/usr/lib64/python3.9/lib-dynload/fcntl.cpython-39-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.9/lib-dynload/fcntl.cpython-39-x86_64-linux-gnu.so' import 'fcntl' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7f502bb0d580> # /usr/lib64/python3.9/__pycache__/locale.cpython-39.pyc matches /usr/lib64/python3.9/locale.py # code object from '/usr/lib64/python3.9/__pycache__/locale.cpython-39.pyc' import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f502bb0d100> # /usr/lib64/python3.9/__pycache__/platform.cpython-39.pyc matches /usr/lib64/python3.9/platform.py # code object from '/usr/lib64/python3.9/__pycache__/platform.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/subprocess.cpython-39.pyc matches /usr/lib64/python3.9/subprocess.py # code object from '/usr/lib64/python3.9/__pycache__/subprocess.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/signal.cpython-39.pyc matches /usr/lib64/python3.9/signal.py # code object from '/usr/lib64/python3.9/__pycache__/signal.cpython-39.pyc' import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f502ba64fa0> # extension module '_posixsubprocess' loaded from '/usr/lib64/python3.9/lib-dynload/_posixsubprocess.cpython-39-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.9/lib-dynload/_posixsubprocess.cpython-39-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f502ba82c70> # extension module 'select' loaded from '/usr/lib64/python3.9/lib-dynload/select.cpython-39-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.9/lib-dynload/select.cpython-39-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f502ba82f70> # /usr/lib64/python3.9/__pycache__/selectors.cpython-39.pyc matches /usr/lib64/python3.9/selectors.py # code object from '/usr/lib64/python3.9/__pycache__/selectors.cpython-39.pyc' import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7f502ba82310> import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 
0x7f502bb75dc0> import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7f502bb753a0> # /usr/lib64/python3.9/__pycache__/shlex.cpython-39.pyc matches /usr/lib64/python3.9/shlex.py # code object from '/usr/lib64/python3.9/__pycache__/shlex.cpython-39.pyc' import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7f502bb75f40> # /usr/lib64/python3.9/__pycache__/traceback.cpython-39.pyc matches /usr/lib64/python3.9/traceback.py # code object from '/usr/lib64/python3.9/__pycache__/traceback.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/linecache.cpython-39.pyc matches /usr/lib64/python3.9/linecache.py # code object from '/usr/lib64/python3.9/__pycache__/linecache.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/tokenize.cpython-39.pyc matches /usr/lib64/python3.9/tokenize.py # code object from '/usr/lib64/python3.9/__pycache__/tokenize.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/token.cpython-39.pyc matches /usr/lib64/python3.9/token.py # code object from '/usr/lib64/python3.9/__pycache__/token.cpython-39.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7f502bb44e80> import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f502bae0d90> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7f502bae0460> import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7f502bb17550> # extension module 'syslog' loaded from '/usr/lib64/python3.9/lib-dynload/syslog.cpython-39-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.9/lib-dynload/syslog.cpython-39-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f502bae0580> # /usr/lib64/python3.9/site-packages/systemd/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/__init__.cpython-39.pyc' 
import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7f502bae05b0> # /usr/lib64/python3.9/site-packages/systemd/__pycache__/journal.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/journal.py # code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/journal.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/datetime.cpython-39.pyc matches /usr/lib64/python3.9/datetime.py # code object from '/usr/lib64/python3.9/__pycache__/datetime.cpython-39.pyc' # extension module '_datetime' loaded from '/usr/lib64/python3.9/lib-dynload/_datetime.cpython-39-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.9/lib-dynload/_datetime.cpython-39-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f502ba55f70> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7f502bb562b0> # /usr/lib64/python3.9/__pycache__/uuid.cpython-39.pyc matches /usr/lib64/python3.9/uuid.py # code object from '/usr/lib64/python3.9/__pycache__/uuid.cpython-39.pyc' # extension module '_uuid' loaded from '/usr/lib64/python3.9/lib-dynload/_uuid.cpython-39-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.9/lib-dynload/_uuid.cpython-39-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f502ba527f0> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7f502bb56430> # /usr/lib64/python3.9/logging/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/logging/__init__.py # code object from '/usr/lib64/python3.9/logging/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/string.cpython-39.pyc matches /usr/lib64/python3.9/string.py # code object from '/usr/lib64/python3.9/__pycache__/string.cpython-39.pyc' import '_string' # import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7f502bb6de80> import 
'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7f502ba52790> # extension module 'systemd._journal' loaded from '/usr/lib64/python3.9/site-packages/systemd/_journal.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.9/site-packages/systemd/_journal.cpython-39-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f502ba525e0> # extension module 'systemd._reader' loaded from '/usr/lib64/python3.9/site-packages/systemd/_reader.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.9/site-packages/systemd/_reader.cpython-39-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f502ba51550> # extension module 'systemd.id128' loaded from '/usr/lib64/python3.9/site-packages/systemd/id128.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.9/site-packages/systemd/id128.cpython-39-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f502ba51490> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f502bb4e970> # /usr/lib64/python3.9/site-packages/systemd/__pycache__/daemon.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/daemon.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/socket.cpython-39.pyc matches /usr/lib64/python3.9/socket.py # code object from '/usr/lib64/python3.9/__pycache__/socket.cpython-39.pyc' # extension module '_socket' loaded from '/usr/lib64/python3.9/lib-dynload/_socket.cpython-39-x86_64-linux-gnu.so' # extension module '_socket' executed from '/usr/lib64/python3.9/lib-dynload/_socket.cpython-39-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object 
at 0x7f502bad66a0> # extension module 'array' loaded from '/usr/lib64/python3.9/lib-dynload/array.cpython-39-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.9/lib-dynload/array.cpython-39-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f502bad5b80> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7f502bae60a0> # extension module 'systemd._daemon' loaded from '/usr/lib64/python3.9/site-packages/systemd/_daemon.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.9/site-packages/systemd/_daemon.cpython-39-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f502bad6100> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7f502bb19be0> # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.compat # loaded from Zip /tmp/ansible_stat_payload_i4j83o7a/ansible_stat_payload.zip/ansible/module_utils/compat/__init__.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common # loaded from Zip /tmp/ansible_stat_payload_i4j83o7a/ansible_stat_payload.zip/ansible/module_utils/common/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common.text # loaded from Zip /tmp/ansible_stat_payload_i4j83o7a/ansible_stat_payload.zip/ansible/module_utils/common/text/__init__.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.six # loaded from Zip /tmp/ansible_stat_payload_i4j83o7a/ansible_stat_payload.zip/ansible/module_utils/six/__init__.py import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import ansible.module_utils.common.text.converters # 
loaded from Zip /tmp/ansible_stat_payload_i4j83o7a/ansible_stat_payload.zip/ansible/module_utils/common/text/converters.py # /usr/lib64/python3.9/ctypes/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/ctypes/__init__.py # code object from '/usr/lib64/python3.9/ctypes/__pycache__/__init__.cpython-39.pyc' # extension module '_ctypes' loaded from '/usr/lib64/python3.9/lib-dynload/_ctypes.cpython-39-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.9/lib-dynload/_ctypes.cpython-39-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f502ba1dac0> # /usr/lib64/python3.9/ctypes/__pycache__/_endian.cpython-39.pyc matches /usr/lib64/python3.9/ctypes/_endian.py # code object from '/usr/lib64/python3.9/ctypes/__pycache__/_endian.cpython-39.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7f502bad3d00> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7f502baca850> import ansible.module_utils.compat.selinux # loaded from Zip /tmp/ansible_stat_payload_i4j83o7a/ansible_stat_payload.zip/ansible/module_utils/compat/selinux.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils._text # loaded from Zip /tmp/ansible_stat_payload_i4j83o7a/ansible_stat_payload.zip/ansible/module_utils/_text.py # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/__pycache__/copy.cpython-39.pyc matches /usr/lib64/python3.9/copy.py # code object from '/usr/lib64/python3.9/__pycache__/copy.cpython-39.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f502bad59d0> # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common.collections # loaded from Zip /tmp/ansible_stat_payload_i4j83o7a/ansible_stat_payload.zip/ansible/module_utils/common/collections.py # zipimport: zlib 
available # zipimport: zlib available import ansible.module_utils.common.warnings # loaded from Zip /tmp/ansible_stat_payload_i4j83o7a/ansible_stat_payload.zip/ansible/module_utils/common/warnings.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.errors # loaded from Zip /tmp/ansible_stat_payload_i4j83o7a/ansible_stat_payload.zip/ansible/module_utils/errors.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.parsing # loaded from Zip /tmp/ansible_stat_payload_i4j83o7a/ansible_stat_payload.zip/ansible/module_utils/parsing/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.parsing.convert_bool # loaded from Zip /tmp/ansible_stat_payload_i4j83o7a/ansible_stat_payload.zip/ansible/module_utils/parsing/convert_bool.py # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/__pycache__/ast.cpython-39.pyc matches /usr/lib64/python3.9/ast.py # code object from '/usr/lib64/python3.9/__pycache__/ast.cpython-39.pyc' import '_ast' # import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7f502b63d310> # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common.text.formatters # loaded from Zip /tmp/ansible_stat_payload_i4j83o7a/ansible_stat_payload.zip/ansible/module_utils/common/text/formatters.py import ansible.module_utils.common.validation # loaded from Zip /tmp/ansible_stat_payload_i4j83o7a/ansible_stat_payload.zip/ansible/module_utils/common/validation.py import ansible.module_utils.common.parameters # loaded from Zip /tmp/ansible_stat_payload_i4j83o7a/ansible_stat_payload.zip/ansible/module_utils/common/parameters.py import ansible.module_utils.common.arg_spec # loaded from Zip /tmp/ansible_stat_payload_i4j83o7a/ansible_stat_payload.zip/ansible/module_utils/common/arg_spec.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common.locale # loaded from 
Zip /tmp/ansible_stat_payload_i4j83o7a/ansible_stat_payload.zip/ansible/module_utils/common/locale.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/site-packages/selinux/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/selinux/__init__.py # code object from '/usr/lib64/python3.9/site-packages/selinux/__pycache__/__init__.cpython-39.pyc' # extension module 'selinux._selinux' loaded from '/usr/lib64/python3.9/site-packages/selinux/_selinux.cpython-39-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.9/site-packages/selinux/_selinux.cpython-39-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f502bb5e2b0> import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7f502bad37c0> import ansible.module_utils.common.file # loaded from Zip /tmp/ansible_stat_payload_i4j83o7a/ansible_stat_payload.zip/ansible/module_utils/common/file.py import ansible.module_utils.common.process # loaded from Zip /tmp/ansible_stat_payload_i4j83o7a/ansible_stat_payload.zip/ansible/module_utils/common/process.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib/python3.9/site-packages/__pycache__/distro.cpython-39.pyc matches /usr/lib/python3.9/site-packages/distro.py # code object from '/usr/lib/python3.9/site-packages/__pycache__/distro.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/argparse.cpython-39.pyc matches /usr/lib64/python3.9/argparse.py # code object from '/usr/lib64/python3.9/__pycache__/argparse.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/gettext.cpython-39.pyc matches /usr/lib64/python3.9/gettext.py # code object from '/usr/lib64/python3.9/__pycache__/gettext.cpython-39.pyc' import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7f502b620760> 
import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f502ba14610> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f502ba13b80> # destroy ansible.module_utils.distro import ansible.module_utils.distro # loaded from Zip /tmp/ansible_stat_payload_i4j83o7a/ansible_stat_payload.zip/ansible/module_utils/distro/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common._utils # loaded from Zip /tmp/ansible_stat_payload_i4j83o7a/ansible_stat_payload.zip/ansible/module_utils/common/_utils.py import ansible.module_utils.common.sys_info # loaded from Zip /tmp/ansible_stat_payload_i4j83o7a/ansible_stat_payload.zip/ansible/module_utils/common/sys_info.py import ansible.module_utils.basic # loaded from Zip /tmp/ansible_stat_payload_i4j83o7a/ansible_stat_payload.zip/ansible/module_utils/basic.py # zipimport: zlib available # zipimport: zlib available import ansible.modules # loaded from Zip /tmp/ansible_stat_payload_i4j83o7a/ansible_stat_payload.zip/ansible/modules/__init__.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"path": "/run/ostree-booted", "follow": false, "get_checksum": true, "get_mime": true, "get_attributes": true, "checksum_algorithm": "sha1"}}} # destroy __main__ # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.path_hooks # clear sys.path_importer_cache # clear sys.meta_path # clear sys.__interactivehook__ # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # 
cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing encodings.latin_1 # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing _locale # cleanup[2] removing _bootlocale # destroy _bootlocale # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing sre_constants # destroy sre_constants # cleanup[2] removing sre_parse # cleanup[2] removing sre_compile # cleanup[2] removing _heapq # cleanup[2] removing heapq # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # destroy base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing collections.abc # cleanup[2] removing contextlib # cleanup[2] removing typing # destroy typing # cleanup[2] removing importlib.abc # 
cleanup[2] removing importlib.util # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing threading # cleanup[2] removing _bz2 # destroy _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing pwd # cleanup[2] removing grp # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing tempfile # cleanup[2] removing zipfile # destroy zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing fcntl # cleanup[2] removing locale # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing subprocess # cleanup[2] removing platform # cleanup[2] removing shlex # cleanup[2] removing token # destroy token # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # 
destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # destroy socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] 
removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # destroy _sitebuiltins # destroy importlib.util # destroy importlib.abc # destroy importlib.machinery # destroy zipimport # destroy _compression # destroy binascii # destroy importlib # destroy struct # destroy bz2 # destroy lzma # destroy __main__ # destroy locale # destroy tempfile # destroy systemd.journal # destroy systemd.daemon # destroy ansible.module_utils.compat.selinux # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy encodings # destroy syslog # destroy uuid # destroy array # destroy datetime # destroy selinux # destroy distro # destroy json # destroy shlex # destroy logging # destroy argparse # cleanup[3] wiping selinux._selinux # cleanup[3] 
wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # cleanup[3] wiping tokenize # cleanup[3] wiping platform # destroy subprocess # cleanup[3] wiping selectors # cleanup[3] wiping select # cleanup[3] wiping _posixsubprocess # cleanup[3] wiping signal # cleanup[3] wiping fcntl # cleanup[3] wiping atexit # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping _blake2 # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping shutil # destroy fnmatch # cleanup[3] wiping grp # cleanup[3] wiping pwd # cleanup[3] wiping _lzma # cleanup[3] wiping threading # cleanup[3] wiping zlib # cleanup[3] wiping errno # cleanup[3] wiping weakref # cleanup[3] wiping contextlib # cleanup[3] wiping collections.abc # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy enum # destroy sre_compile # destroy copyreg # cleanup[3] wiping functools # cleanup[3] wiping _functools # destroy _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy heapq # destroy collections.abc # cleanup[3] wiping _collections # destroy _collections # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping itertools # cleanup[3] wiping _heapq # cleanup[3] wiping sre_parse # cleanup[3] wiping _sre # cleanup[3] wiping types # cleanup[3] wiping _locale # destroy _locale # cleanup[3] wiping os # cleanup[3] wiping os.path # destroy genericpath # cleanup[3] wiping 
posixpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.latin_1 # cleanup[3] wiping _signal # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy systemd._daemon # destroy _socket # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy fcntl # destroy _blake2 # destroy _lzma # destroy zlib # destroy _signal # destroy platform # destroy _uuid # destroy _sre # destroy sre_parse # destroy tokenize # destroy _heapq # destroy posixpath # destroy stat # destroy ansible.module_utils.six.moves.urllib # destroy errno # destroy signal # destroy contextlib # destroy pwd # destroy grp # destroy _posixsubprocess # destroy selectors # destroy select # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy itertools # destroy operator # destroy ansible.module_utils.six.moves # destroy _operator # destroy _frozen_importlib_external # destroy _imp # destroy io # destroy marshal # destroy _frozen_importlib # clear sys.audit hooks , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.148 closed. [WARNING]: Module invocation had junk after the JSON data: # destroy __main__ # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.path_hooks # clear sys.path_importer_cache # clear sys.meta_path # clear sys.__interactivehook__ # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing 
_signal # cleanup[2] removing encodings.latin_1 # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing _locale # cleanup[2] removing _bootlocale # destroy _bootlocale # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing sre_constants # destroy sre_constants # cleanup[2] removing sre_parse # cleanup[2] removing sre_compile # cleanup[2] removing _heapq # cleanup[2] removing heapq # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # destroy base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing collections.abc # cleanup[2] removing contextlib # cleanup[2] removing typing # destroy typing # cleanup[2] removing importlib.abc # cleanup[2] removing importlib.util # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing 
_compression # cleanup[2] removing threading # cleanup[2] removing _bz2 # destroy _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing pwd # cleanup[2] removing grp # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing tempfile # cleanup[2] removing zipfile # destroy zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing fcntl # cleanup[2] removing locale # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing subprocess # cleanup[2] removing platform # cleanup[2] removing shlex # cleanup[2] removing token # destroy token # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # destroy socket # cleanup[2] removing 
systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # 
cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # destroy _sitebuiltins # destroy importlib.util # destroy importlib.abc # destroy importlib.machinery # destroy zipimport # destroy _compression # destroy binascii # destroy importlib # destroy struct # destroy bz2 # destroy lzma # destroy __main__ # destroy locale # destroy tempfile # destroy systemd.journal # destroy systemd.daemon # destroy ansible.module_utils.compat.selinux # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy encodings # destroy syslog # destroy uuid # destroy array # destroy datetime # destroy selinux # destroy distro # destroy json # destroy shlex # destroy logging # destroy argparse # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] 
wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # cleanup[3] wiping tokenize # cleanup[3] wiping platform # destroy subprocess # cleanup[3] wiping selectors # cleanup[3] wiping select # cleanup[3] wiping _posixsubprocess # cleanup[3] wiping signal # cleanup[3] wiping fcntl # cleanup[3] wiping atexit # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping _blake2 # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping shutil # destroy fnmatch # cleanup[3] wiping grp # cleanup[3] wiping pwd # cleanup[3] wiping _lzma # cleanup[3] wiping threading # cleanup[3] wiping zlib # cleanup[3] wiping errno # cleanup[3] wiping weakref # cleanup[3] wiping contextlib # cleanup[3] wiping collections.abc # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy enum # destroy sre_compile # destroy copyreg # cleanup[3] wiping functools # cleanup[3] wiping _functools # destroy _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy heapq # destroy collections.abc # cleanup[3] wiping _collections # destroy _collections # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping itertools # cleanup[3] wiping _heapq # cleanup[3] wiping sre_parse # cleanup[3] wiping _sre # cleanup[3] wiping types # cleanup[3] wiping _locale # destroy _locale # cleanup[3] wiping os # cleanup[3] wiping os.path # destroy genericpath # cleanup[3] wiping posixpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.latin_1 # cleanup[3] wiping _signal # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] 
wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy systemd._daemon # destroy _socket # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy fcntl # destroy _blake2 # destroy _lzma # destroy zlib # destroy _signal # destroy platform # destroy _uuid # destroy _sre # destroy sre_parse # destroy tokenize # destroy _heapq # destroy posixpath # destroy stat # destroy ansible.module_utils.six.moves.urllib # destroy errno # destroy signal # destroy contextlib # destroy pwd # destroy grp # destroy _posixsubprocess # destroy selectors # destroy select # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy itertools # destroy operator # destroy ansible.module_utils.six.moves # destroy _operator # destroy _frozen_importlib_external # destroy _imp # destroy io # destroy marshal # destroy _frozen_importlib # clear sys.audit hooks 41684 1727204446.65708: done with _execute_module (stat, {'path': '/run/ostree-booted', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', 
'_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204446.118009-42174-242621941361361/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 41684 1727204446.65711: _low_level_execute_command(): starting 41684 1727204446.65713: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204446.118009-42174-242621941361361/ > /dev/null 2>&1 && sleep 0' 41684 1727204446.65903: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 41684 1727204446.65907: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41684 1727204446.65945: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204446.65948: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 41684 1727204446.65950: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204446.66009: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 41684 1727204446.66012: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 41684 1727204446.66081: stderr chunk (state=3): >>>debug1: mux_client_request_session: 
master session id: 2 <<< 41684 1727204446.68514: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41684 1727204446.68580: stderr chunk (state=3): >>><<< 41684 1727204446.68583: stdout chunk (state=3): >>><<< 41684 1727204446.68675: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41684 1727204446.68678: handler run complete 41684 1727204446.68681: attempt loop complete, returning result 41684 1727204446.68682: _execute() done 41684 1727204446.68684: dumping result to json 41684 1727204446.68686: done dumping result, returning 41684 1727204446.68688: done running TaskExecutor() for managed-node1/TASK: Check if system is ostree [0affcd87-79f5-3839-086d-0000000000d2] 41684 1727204446.68690: sending task result for task 0affcd87-79f5-3839-086d-0000000000d2 41684 1727204446.68748: done sending task result for task 
0affcd87-79f5-3839-086d-0000000000d2 41684 1727204446.68750: WORKER PROCESS EXITING ok: [managed-node1] => { "changed": false, "stat": { "exists": false } } 41684 1727204446.68812: no more pending results, returning what we have 41684 1727204446.68815: results queue empty 41684 1727204446.68816: checking for any_errors_fatal 41684 1727204446.68824: done checking for any_errors_fatal 41684 1727204446.68824: checking for max_fail_percentage 41684 1727204446.68826: done checking for max_fail_percentage 41684 1727204446.68826: checking to see if all hosts have failed and the running result is not ok 41684 1727204446.68827: done checking to see if all hosts have failed 41684 1727204446.68828: getting the remaining hosts for this loop 41684 1727204446.68830: done getting the remaining hosts for this loop 41684 1727204446.68833: getting the next task for host managed-node1 41684 1727204446.68839: done getting next task for host managed-node1 41684 1727204446.68842: ^ task is: TASK: Set flag to indicate system is ostree 41684 1727204446.68845: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 41684 1727204446.68848: getting variables 41684 1727204446.68849: in VariableManager get_vars() 41684 1727204446.68883: Calling all_inventory to load vars for managed-node1 41684 1727204446.68886: Calling groups_inventory to load vars for managed-node1 41684 1727204446.68889: Calling all_plugins_inventory to load vars for managed-node1 41684 1727204446.68900: Calling all_plugins_play to load vars for managed-node1 41684 1727204446.68902: Calling groups_plugins_inventory to load vars for managed-node1 41684 1727204446.68905: Calling groups_plugins_play to load vars for managed-node1 41684 1727204446.69046: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41684 1727204446.69174: done with get_vars() 41684 1727204446.69182: done getting variables 41684 1727204446.69277: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [Set flag to indicate system is ostree] *********************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:22 Tuesday 24 September 2024 15:00:46 -0400 (0:00:00.661) 0:00:03.094 ***** 41684 1727204446.69317: entering _queue_task() for managed-node1/set_fact 41684 1727204446.69319: Creating lock for set_fact 41684 1727204446.69599: worker is 1 (out of 1 available) 41684 1727204446.69609: exiting _queue_task() for managed-node1/set_fact 41684 1727204446.69621: done queuing things up, now waiting for results queue to drain 41684 1727204446.69623: waiting for pending results... 
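[editor's note] The stat task above returned `exists: false`, and the `set_fact` task being queued records that result as a boolean flag. A minimal sketch of this pattern, using the variable names visible in the log (`__ostree_booted_stat`, `__network_is_ostree`) and the conditional the log shows being evaluated; the marker path `/run/ostree-booted` is an assumption, since the log does not show it:

```yaml
# Sketch only: /run/ostree-booted is the conventional ostree marker path,
# assumed here -- the actual path is not visible in this log excerpt.
- name: Check if system is ostree
  stat:
    path: /run/ostree-booted
  register: __ostree_booted_stat
  when: not __network_is_ostree is defined

- name: Set flag to indicate system is ostree
  set_fact:
    __network_is_ostree: "{{ __ostree_booted_stat.stat.exists }}"
  when: not __network_is_ostree is defined
```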
41684 1727204446.69861: running TaskExecutor() for managed-node1/TASK: Set flag to indicate system is ostree 41684 1727204446.69978: in run() - task 0affcd87-79f5-3839-086d-0000000000d3 41684 1727204446.69997: variable 'ansible_search_path' from source: unknown 41684 1727204446.70005: variable 'ansible_search_path' from source: unknown 41684 1727204446.70048: calling self._execute() 41684 1727204446.70135: variable 'ansible_host' from source: host vars for 'managed-node1' 41684 1727204446.70146: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41684 1727204446.70158: variable 'omit' from source: magic vars 41684 1727204446.70639: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 41684 1727204446.71064: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 41684 1727204446.71104: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 41684 1727204446.71128: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 41684 1727204446.71153: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 41684 1727204446.71226: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 41684 1727204446.71244: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 41684 1727204446.71265: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 41684 1727204446.71291: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 41684 1727204446.71377: Evaluated conditional (not __network_is_ostree is defined): True 41684 1727204446.71385: variable 'omit' from source: magic vars 41684 1727204446.71417: variable 'omit' from source: magic vars 41684 1727204446.71504: variable '__ostree_booted_stat' from source: set_fact 41684 1727204446.71541: variable 'omit' from source: magic vars 41684 1727204446.71560: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 41684 1727204446.71586: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 41684 1727204446.71602: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 41684 1727204446.71617: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41684 1727204446.71625: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41684 1727204446.71648: variable 'inventory_hostname' from source: host vars for 'managed-node1' 41684 1727204446.71651: variable 'ansible_host' from source: host vars for 'managed-node1' 41684 1727204446.71653: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41684 1727204446.71722: Set connection var ansible_connection to ssh 41684 1727204446.71728: Set connection var ansible_pipelining to False 41684 1727204446.71732: Set connection var ansible_module_compression to ZIP_DEFLATED 41684 1727204446.71740: Set connection var ansible_timeout to 10 41684 1727204446.71746: Set connection var ansible_shell_executable to /bin/sh 41684 1727204446.71749: Set connection var ansible_shell_type to sh 41684 1727204446.71770: variable 'ansible_shell_executable' 
from source: unknown 41684 1727204446.71773: variable 'ansible_connection' from source: unknown 41684 1727204446.71776: variable 'ansible_module_compression' from source: unknown 41684 1727204446.71778: variable 'ansible_shell_type' from source: unknown 41684 1727204446.71780: variable 'ansible_shell_executable' from source: unknown 41684 1727204446.71782: variable 'ansible_host' from source: host vars for 'managed-node1' 41684 1727204446.71785: variable 'ansible_pipelining' from source: unknown 41684 1727204446.71788: variable 'ansible_timeout' from source: unknown 41684 1727204446.71792: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41684 1727204446.71871: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 41684 1727204446.71879: variable 'omit' from source: magic vars 41684 1727204446.71883: starting attempt loop 41684 1727204446.71886: running the handler 41684 1727204446.71894: handler run complete 41684 1727204446.71902: attempt loop complete, returning result 41684 1727204446.71904: _execute() done 41684 1727204446.71907: dumping result to json 41684 1727204446.71909: done dumping result, returning 41684 1727204446.71916: done running TaskExecutor() for managed-node1/TASK: Set flag to indicate system is ostree [0affcd87-79f5-3839-086d-0000000000d3] 41684 1727204446.71925: sending task result for task 0affcd87-79f5-3839-086d-0000000000d3 41684 1727204446.72006: done sending task result for task 0affcd87-79f5-3839-086d-0000000000d3 41684 1727204446.72009: WORKER PROCESS EXITING ok: [managed-node1] => { "ansible_facts": { "__network_is_ostree": false }, "changed": false } 41684 1727204446.72078: no more pending results, returning what we have 41684 1727204446.72081: results 
queue empty 41684 1727204446.72082: checking for any_errors_fatal 41684 1727204446.72087: done checking for any_errors_fatal 41684 1727204446.72088: checking for max_fail_percentage 41684 1727204446.72089: done checking for max_fail_percentage 41684 1727204446.72090: checking to see if all hosts have failed and the running result is not ok 41684 1727204446.72091: done checking to see if all hosts have failed 41684 1727204446.72091: getting the remaining hosts for this loop 41684 1727204446.72093: done getting the remaining hosts for this loop 41684 1727204446.72097: getting the next task for host managed-node1 41684 1727204446.72106: done getting next task for host managed-node1 41684 1727204446.72109: ^ task is: TASK: Fix CentOS6 Base repo 41684 1727204446.72111: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 41684 1727204446.72115: getting variables 41684 1727204446.72116: in VariableManager get_vars() 41684 1727204446.72143: Calling all_inventory to load vars for managed-node1 41684 1727204446.72146: Calling groups_inventory to load vars for managed-node1 41684 1727204446.72149: Calling all_plugins_inventory to load vars for managed-node1 41684 1727204446.72158: Calling all_plugins_play to load vars for managed-node1 41684 1727204446.72161: Calling groups_plugins_inventory to load vars for managed-node1 41684 1727204446.72172: Calling groups_plugins_play to load vars for managed-node1 41684 1727204446.72435: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41684 1727204446.72608: done with get_vars() 41684 1727204446.72617: done getting variables 41684 1727204446.72727: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [Fix CentOS6 Base repo] *************************************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:26 Tuesday 24 September 2024 15:00:46 -0400 (0:00:00.034) 0:00:03.129 ***** 41684 1727204446.72753: entering _queue_task() for managed-node1/copy 41684 1727204446.73016: worker is 1 (out of 1 available) 41684 1727204446.73028: exiting _queue_task() for managed-node1/copy 41684 1727204446.73040: done queuing things up, now waiting for results queue to drain 41684 1727204446.73041: waiting for pending results... 
41684 1727204446.73287: running TaskExecutor() for managed-node1/TASK: Fix CentOS6 Base repo 41684 1727204446.73398: in run() - task 0affcd87-79f5-3839-086d-0000000000d5 41684 1727204446.73416: variable 'ansible_search_path' from source: unknown 41684 1727204446.73423: variable 'ansible_search_path' from source: unknown 41684 1727204446.73463: calling self._execute() 41684 1727204446.73544: variable 'ansible_host' from source: host vars for 'managed-node1' 41684 1727204446.73554: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41684 1727204446.73571: variable 'omit' from source: magic vars 41684 1727204446.74043: variable 'ansible_distribution' from source: facts 41684 1727204446.74072: Evaluated conditional (ansible_distribution == 'CentOS'): True 41684 1727204446.74190: variable 'ansible_distribution_major_version' from source: facts 41684 1727204446.74202: Evaluated conditional (ansible_distribution_major_version == '6'): False 41684 1727204446.74208: when evaluation is False, skipping this task 41684 1727204446.74214: _execute() done 41684 1727204446.74220: dumping result to json 41684 1727204446.74225: done dumping result, returning 41684 1727204446.74235: done running TaskExecutor() for managed-node1/TASK: Fix CentOS6 Base repo [0affcd87-79f5-3839-086d-0000000000d5] 41684 1727204446.74246: sending task result for task 0affcd87-79f5-3839-086d-0000000000d5 skipping: [managed-node1] => { "changed": false, "false_condition": "ansible_distribution_major_version == '6'", "skip_reason": "Conditional result was False" } 41684 1727204446.74417: no more pending results, returning what we have 41684 1727204446.74421: results queue empty 41684 1727204446.74422: checking for any_errors_fatal 41684 1727204446.74430: done checking for any_errors_fatal 41684 1727204446.74431: checking for max_fail_percentage 41684 1727204446.74432: done checking for max_fail_percentage 41684 1727204446.74433: checking to see if all hosts have failed and the 
running result is not ok 41684 1727204446.74434: done checking to see if all hosts have failed 41684 1727204446.74434: getting the remaining hosts for this loop 41684 1727204446.74436: done getting the remaining hosts for this loop 41684 1727204446.74441: getting the next task for host managed-node1 41684 1727204446.74448: done getting next task for host managed-node1 41684 1727204446.74452: ^ task is: TASK: Include the task 'enable_epel.yml' 41684 1727204446.74455: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 41684 1727204446.74458: getting variables 41684 1727204446.74460: in VariableManager get_vars() 41684 1727204446.74491: Calling all_inventory to load vars for managed-node1 41684 1727204446.74494: Calling groups_inventory to load vars for managed-node1 41684 1727204446.74498: Calling all_plugins_inventory to load vars for managed-node1 41684 1727204446.74511: Calling all_plugins_play to load vars for managed-node1 41684 1727204446.74513: Calling groups_plugins_inventory to load vars for managed-node1 41684 1727204446.74516: Calling groups_plugins_play to load vars for managed-node1 41684 1727204446.74719: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41684 1727204446.74919: done with get_vars() 41684 1727204446.74930: done getting variables 41684 1727204446.75235: done sending task result for task 0affcd87-79f5-3839-086d-0000000000d5 41684 1727204446.75238: WORKER PROCESS EXITING TASK [Include the task 'enable_epel.yml'] ************************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:51 Tuesday 24 September 2024 15:00:46 -0400 (0:00:00.025) 0:00:03.154 ***** 41684 1727204446.75314: entering _queue_task() for managed-node1/include_tasks 41684 1727204446.75562: worker is 1 (out of 1 available) 41684 1727204446.75575: exiting _queue_task() for managed-node1/include_tasks 41684 1727204446.75587: done queuing things up, now waiting for results queue to drain 41684 1727204446.75588: waiting for pending results... 
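[editor's note] The "Fix CentOS6 Base repo" task above was skipped: the first `when` clause (`ansible_distribution == 'CentOS'`) evaluated True, but the second (`ansible_distribution_major_version == '6'`) evaluated False, which the result reports as `false_condition`. A sketch of how such a guarded task is typically written; only the two conditions are taken from the log, and the `copy` payload is hypothetical:

```yaml
- name: Fix CentOS6 Base repo
  copy:
    dest: /etc/yum.repos.d/CentOS-Base.repo   # hypothetical destination
    content: "..."                            # repo definition not shown in the log
  when:
    - ansible_distribution == 'CentOS'
    - ansible_distribution_major_version == '6'
```

Note that a multi-condition `when` list is ANDed, and all conditions appear in the skip result's `false_condition` diagnostics only for the clause that failed.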
41684 1727204446.75829: running TaskExecutor() for managed-node1/TASK: Include the task 'enable_epel.yml' 41684 1727204446.75944: in run() - task 0affcd87-79f5-3839-086d-0000000000d6 41684 1727204446.75962: variable 'ansible_search_path' from source: unknown 41684 1727204446.75975: variable 'ansible_search_path' from source: unknown 41684 1727204446.76017: calling self._execute() 41684 1727204446.76097: variable 'ansible_host' from source: host vars for 'managed-node1' 41684 1727204446.76109: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41684 1727204446.76122: variable 'omit' from source: magic vars 41684 1727204446.76718: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 41684 1727204446.79014: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 41684 1727204446.79098: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 41684 1727204446.79138: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 41684 1727204446.79182: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 41684 1727204446.79214: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 41684 1727204446.79334: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 41684 1727204446.79369: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 41684 1727204446.79406: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 41684 1727204446.79452: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 41684 1727204446.79476: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 41684 1727204446.79600: variable '__network_is_ostree' from source: set_fact 41684 1727204446.79630: Evaluated conditional (not __network_is_ostree | d(false)): True 41684 1727204446.79643: _execute() done 41684 1727204446.79652: dumping result to json 41684 1727204446.79658: done dumping result, returning 41684 1727204446.79672: done running TaskExecutor() for managed-node1/TASK: Include the task 'enable_epel.yml' [0affcd87-79f5-3839-086d-0000000000d6] 41684 1727204446.79682: sending task result for task 0affcd87-79f5-3839-086d-0000000000d6 41684 1727204446.79809: no more pending results, returning what we have 41684 1727204446.79815: in VariableManager get_vars() 41684 1727204446.79850: Calling all_inventory to load vars for managed-node1 41684 1727204446.79853: Calling groups_inventory to load vars for managed-node1 41684 1727204446.79857: Calling all_plugins_inventory to load vars for managed-node1 41684 1727204446.79871: Calling all_plugins_play to load vars for managed-node1 41684 1727204446.79874: Calling groups_plugins_inventory to load vars for managed-node1 41684 1727204446.79878: Calling groups_plugins_play to load vars for managed-node1 41684 1727204446.80118: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41684 1727204446.80626: done with get_vars() 41684 1727204446.80635: variable 'ansible_search_path' from source: unknown 41684 
1727204446.80636: variable 'ansible_search_path' from source: unknown 41684 1727204446.80680: we have included files to process 41684 1727204446.80681: generating all_blocks data 41684 1727204446.80683: done generating all_blocks data 41684 1727204446.80689: processing included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml 41684 1727204446.80691: loading included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml 41684 1727204446.80694: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml 41684 1727204446.81301: done sending task result for task 0affcd87-79f5-3839-086d-0000000000d6 41684 1727204446.81305: WORKER PROCESS EXITING 41684 1727204446.81716: done processing included file 41684 1727204446.81719: iterating over new_blocks loaded from include file 41684 1727204446.81720: in VariableManager get_vars() 41684 1727204446.81734: done with get_vars() 41684 1727204446.81736: filtering new block on tags 41684 1727204446.81760: done filtering new block on tags 41684 1727204446.81765: in VariableManager get_vars() 41684 1727204446.81778: done with get_vars() 41684 1727204446.81780: filtering new block on tags 41684 1727204446.81792: done filtering new block on tags 41684 1727204446.81794: done iterating over new_blocks loaded from include file included: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml for managed-node1 41684 1727204446.81800: extending task lists for all hosts with included blocks 41684 1727204446.81903: done extending task lists 41684 1727204446.81904: done processing included files 41684 1727204446.81905: results queue empty 41684 1727204446.81906: checking for any_errors_fatal 41684 1727204446.81909: done checking for any_errors_fatal 41684 1727204446.81910: checking for max_fail_percentage 41684 
1727204446.81911: done checking for max_fail_percentage 41684 1727204446.81912: checking to see if all hosts have failed and the running result is not ok 41684 1727204446.81913: done checking to see if all hosts have failed 41684 1727204446.81913: getting the remaining hosts for this loop 41684 1727204446.81915: done getting the remaining hosts for this loop 41684 1727204446.81917: getting the next task for host managed-node1 41684 1727204446.81921: done getting next task for host managed-node1 41684 1727204446.81923: ^ task is: TASK: Create EPEL {{ ansible_distribution_major_version }} 41684 1727204446.81926: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 41684 1727204446.81928: getting variables 41684 1727204446.81929: in VariableManager get_vars() 41684 1727204446.81937: Calling all_inventory to load vars for managed-node1 41684 1727204446.81940: Calling groups_inventory to load vars for managed-node1 41684 1727204446.81942: Calling all_plugins_inventory to load vars for managed-node1 41684 1727204446.81947: Calling all_plugins_play to load vars for managed-node1 41684 1727204446.81956: Calling groups_plugins_inventory to load vars for managed-node1 41684 1727204446.81959: Calling groups_plugins_play to load vars for managed-node1 41684 1727204446.82135: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41684 1727204446.82345: done with get_vars() 41684 1727204446.82355: done getting variables 41684 1727204446.82424: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) 41684 1727204446.82630: variable 'ansible_distribution_major_version' from source: facts TASK [Create EPEL 9] *********************************************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:8 Tuesday 24 September 2024 15:00:46 -0400 (0:00:00.073) 0:00:03.228 ***** 41684 1727204446.82679: entering _queue_task() for managed-node1/command 41684 1727204446.82680: Creating lock for command 41684 1727204446.83745: worker is 1 (out of 1 available) 41684 1727204446.83757: exiting _queue_task() for managed-node1/command 41684 1727204446.83770: done queuing things up, now waiting for results queue to drain 41684 1727204446.83771: waiting for pending results... 
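[editor's note] The include above pulled `enable_epel.yml` into the task list, and the next task's name, `Create EPEL {{ ansible_distribution_major_version }}`, was templated against the gathered facts to render the banner `TASK [Create EPEL 9]`. A sketch of the two pieces, with the `include_tasks` guard taken from the conditional the log shows being evaluated (`not __network_is_ostree | d(false)`); the module body of the included task is not visible in the log and is elided:

```yaml
- name: Include the task 'enable_epel.yml'
  include_tasks: tasks/enable_epel.yml
  when: not __network_is_ostree | d(false)

# inside enable_epel.yml -- the task name is a Jinja2 template,
# rendered per host from facts before the TASK banner is printed:
- name: Create EPEL {{ ansible_distribution_major_version }}
  # module and arguments not shown in the log; the task is skipped on EL9
  when:
    - ansible_distribution in ['RedHat', 'CentOS']
    - ansible_distribution_major_version in ['7', '8']
```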
41684 1727204446.84087: running TaskExecutor() for managed-node1/TASK: Create EPEL 9 41684 1727204446.84227: in run() - task 0affcd87-79f5-3839-086d-0000000000f0 41684 1727204446.84247: variable 'ansible_search_path' from source: unknown 41684 1727204446.84254: variable 'ansible_search_path' from source: unknown 41684 1727204446.84298: calling self._execute() 41684 1727204446.84379: variable 'ansible_host' from source: host vars for 'managed-node1' 41684 1727204446.84390: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41684 1727204446.84403: variable 'omit' from source: magic vars 41684 1727204446.84786: variable 'ansible_distribution' from source: facts 41684 1727204446.84803: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): True 41684 1727204446.84938: variable 'ansible_distribution_major_version' from source: facts 41684 1727204446.84950: Evaluated conditional (ansible_distribution_major_version in ['7', '8']): False 41684 1727204446.84960: when evaluation is False, skipping this task 41684 1727204446.84973: _execute() done 41684 1727204446.84983: dumping result to json 41684 1727204446.84991: done dumping result, returning 41684 1727204446.85002: done running TaskExecutor() for managed-node1/TASK: Create EPEL 9 [0affcd87-79f5-3839-086d-0000000000f0] 41684 1727204446.85014: sending task result for task 0affcd87-79f5-3839-086d-0000000000f0 skipping: [managed-node1] => { "changed": false, "false_condition": "ansible_distribution_major_version in ['7', '8']", "skip_reason": "Conditional result was False" } 41684 1727204446.85185: no more pending results, returning what we have 41684 1727204446.85189: results queue empty 41684 1727204446.85191: checking for any_errors_fatal 41684 1727204446.85193: done checking for any_errors_fatal 41684 1727204446.85193: checking for max_fail_percentage 41684 1727204446.85195: done checking for max_fail_percentage 41684 1727204446.85196: checking to see if all hosts have failed and 
the running result is not ok 41684 1727204446.85197: done checking to see if all hosts have failed 41684 1727204446.85197: getting the remaining hosts for this loop 41684 1727204446.85199: done getting the remaining hosts for this loop 41684 1727204446.85203: getting the next task for host managed-node1 41684 1727204446.85211: done getting next task for host managed-node1 41684 1727204446.85214: ^ task is: TASK: Install yum-utils package 41684 1727204446.85218: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 41684 1727204446.85222: getting variables 41684 1727204446.85224: in VariableManager get_vars() 41684 1727204446.85258: Calling all_inventory to load vars for managed-node1 41684 1727204446.85261: Calling groups_inventory to load vars for managed-node1 41684 1727204446.85272: Calling all_plugins_inventory to load vars for managed-node1 41684 1727204446.85286: Calling all_plugins_play to load vars for managed-node1 41684 1727204446.85289: Calling groups_plugins_inventory to load vars for managed-node1 41684 1727204446.85292: Calling groups_plugins_play to load vars for managed-node1 41684 1727204446.85551: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41684 1727204446.85772: done with get_vars() 41684 1727204446.85789: done getting variables 41684 1727204446.85933: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [Install yum-utils package] *********************************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:26 Tuesday 24 September 2024 15:00:46 -0400 (0:00:00.032) 0:00:03.261 ***** 41684 1727204446.85972: entering _queue_task() for managed-node1/package 41684 1727204446.85978: Creating lock for package 41684 1727204446.86027: done sending task result for task 0affcd87-79f5-3839-086d-0000000000f0 41684 1727204446.86039: WORKER PROCESS EXITING 41684 1727204446.86622: worker is 1 (out of 1 available) 41684 1727204446.86634: exiting _queue_task() for managed-node1/package 41684 1727204446.86647: done queuing things up, now waiting for results queue to drain 41684 1727204446.86648: waiting for pending results... 
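[editor's note] The "Install yum-utils package" task queued above uses the same distribution/version guard as the other EPEL-setup tasks, so on this EL9 host it will also be skipped. A sketch assuming the `package` action (which the log's `_queue_task() for managed-node1/package` line indicates) and a package name inferred from the task name:

```yaml
- name: Install yum-utils package
  package:
    name: yum-utils   # assumption: inferred from the task name only
  when:
    - ansible_distribution in ['RedHat', 'CentOS']
    - ansible_distribution_major_version in ['7', '8']
```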
41684 1727204446.87288: running TaskExecutor() for managed-node1/TASK: Install yum-utils package 41684 1727204446.87515: in run() - task 0affcd87-79f5-3839-086d-0000000000f1 41684 1727204446.87647: variable 'ansible_search_path' from source: unknown 41684 1727204446.87656: variable 'ansible_search_path' from source: unknown 41684 1727204446.87699: calling self._execute() 41684 1727204446.87777: variable 'ansible_host' from source: host vars for 'managed-node1' 41684 1727204446.87847: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41684 1727204446.87863: variable 'omit' from source: magic vars 41684 1727204446.88308: variable 'ansible_distribution' from source: facts 41684 1727204446.88327: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): True 41684 1727204446.88466: variable 'ansible_distribution_major_version' from source: facts 41684 1727204446.88478: Evaluated conditional (ansible_distribution_major_version in ['7', '8']): False 41684 1727204446.88485: when evaluation is False, skipping this task 41684 1727204446.88493: _execute() done 41684 1727204446.88502: dumping result to json 41684 1727204446.88509: done dumping result, returning 41684 1727204446.88519: done running TaskExecutor() for managed-node1/TASK: Install yum-utils package [0affcd87-79f5-3839-086d-0000000000f1] 41684 1727204446.88529: sending task result for task 0affcd87-79f5-3839-086d-0000000000f1 skipping: [managed-node1] => { "changed": false, "false_condition": "ansible_distribution_major_version in ['7', '8']", "skip_reason": "Conditional result was False" } 41684 1727204446.88683: no more pending results, returning what we have 41684 1727204446.88688: results queue empty 41684 1727204446.88689: checking for any_errors_fatal 41684 1727204446.88698: done checking for any_errors_fatal 41684 1727204446.88699: checking for max_fail_percentage 41684 1727204446.88701: done checking for max_fail_percentage 41684 1727204446.88701: checking to see if 
all hosts have failed and the running result is not ok 41684 1727204446.88702: done checking to see if all hosts have failed 41684 1727204446.88703: getting the remaining hosts for this loop 41684 1727204446.88705: done getting the remaining hosts for this loop 41684 1727204446.88710: getting the next task for host managed-node1 41684 1727204446.88718: done getting next task for host managed-node1 41684 1727204446.88721: ^ task is: TASK: Enable EPEL 7 41684 1727204446.88725: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 41684 1727204446.88728: getting variables 41684 1727204446.88731: in VariableManager get_vars() 41684 1727204446.88762: Calling all_inventory to load vars for managed-node1 41684 1727204446.88767: Calling groups_inventory to load vars for managed-node1 41684 1727204446.88771: Calling all_plugins_inventory to load vars for managed-node1 41684 1727204446.88784: Calling all_plugins_play to load vars for managed-node1 41684 1727204446.88787: Calling groups_plugins_inventory to load vars for managed-node1 41684 1727204446.88791: Calling groups_plugins_play to load vars for managed-node1 41684 1727204446.88988: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41684 1727204446.89204: done with get_vars() 41684 1727204446.89216: done getting variables 41684 1727204446.89529: done sending task result for task 0affcd87-79f5-3839-086d-0000000000f1 41684 1727204446.89533: WORKER PROCESS EXITING 41684 1727204446.89561: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Enable EPEL 7] *********************************************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:32 Tuesday 24 September 2024 15:00:46 -0400 (0:00:00.036) 0:00:03.297 ***** 41684 1727204446.89592: entering _queue_task() for managed-node1/command 41684 1727204446.89963: worker is 1 (out of 1 available) 41684 1727204446.89975: exiting _queue_task() for managed-node1/command 41684 1727204446.89987: done queuing things up, now waiting for results queue to drain 41684 1727204446.89988: waiting for pending results... 
41684 1727204446.90437: running TaskExecutor() for managed-node1/TASK: Enable EPEL 7 41684 1727204446.90553: in run() - task 0affcd87-79f5-3839-086d-0000000000f2 41684 1727204446.90633: variable 'ansible_search_path' from source: unknown 41684 1727204446.90639: variable 'ansible_search_path' from source: unknown 41684 1727204446.90681: calling self._execute() 41684 1727204446.90800: variable 'ansible_host' from source: host vars for 'managed-node1' 41684 1727204446.90953: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41684 1727204446.90970: variable 'omit' from source: magic vars 41684 1727204446.91699: variable 'ansible_distribution' from source: facts 41684 1727204446.91832: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): True 41684 1727204446.92076: variable 'ansible_distribution_major_version' from source: facts 41684 1727204446.92087: Evaluated conditional (ansible_distribution_major_version in ['7', '8']): False 41684 1727204446.92094: when evaluation is False, skipping this task 41684 1727204446.92101: _execute() done 41684 1727204446.92107: dumping result to json 41684 1727204446.92112: done dumping result, returning 41684 1727204446.92121: done running TaskExecutor() for managed-node1/TASK: Enable EPEL 7 [0affcd87-79f5-3839-086d-0000000000f2] 41684 1727204446.92131: sending task result for task 0affcd87-79f5-3839-086d-0000000000f2 skipping: [managed-node1] => { "changed": false, "false_condition": "ansible_distribution_major_version in ['7', '8']", "skip_reason": "Conditional result was False" } 41684 1727204446.92297: no more pending results, returning what we have 41684 1727204446.92301: results queue empty 41684 1727204446.92302: checking for any_errors_fatal 41684 1727204446.92310: done checking for any_errors_fatal 41684 1727204446.92311: checking for max_fail_percentage 41684 1727204446.92312: done checking for max_fail_percentage 41684 1727204446.92313: checking to see if all hosts have failed and 
the running result is not ok 41684 1727204446.92314: done checking to see if all hosts have failed 41684 1727204446.92315: getting the remaining hosts for this loop 41684 1727204446.92316: done getting the remaining hosts for this loop 41684 1727204446.92323: getting the next task for host managed-node1 41684 1727204446.92330: done getting next task for host managed-node1 41684 1727204446.92333: ^ task is: TASK: Enable EPEL 8 41684 1727204446.92337: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 41684 1727204446.92341: getting variables 41684 1727204446.92342: in VariableManager get_vars() 41684 1727204446.92376: Calling all_inventory to load vars for managed-node1 41684 1727204446.92379: Calling groups_inventory to load vars for managed-node1 41684 1727204446.92383: Calling all_plugins_inventory to load vars for managed-node1 41684 1727204446.92396: Calling all_plugins_play to load vars for managed-node1 41684 1727204446.92398: Calling groups_plugins_inventory to load vars for managed-node1 41684 1727204446.92401: Calling groups_plugins_play to load vars for managed-node1 41684 1727204446.92628: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41684 1727204446.92834: done with get_vars() 41684 1727204446.92845: done getting variables 41684 1727204446.92954: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Enable EPEL 8] *********************************************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:37 Tuesday 24 September 2024 15:00:46 -0400 (0:00:00.033) 0:00:03.331 ***** 41684 1727204446.92991: entering _queue_task() for managed-node1/command 41684 1727204446.93013: done sending task result for task 0affcd87-79f5-3839-086d-0000000000f2 41684 1727204446.93021: WORKER PROCESS EXITING 41684 1727204446.93485: worker is 1 (out of 1 available) 41684 1727204446.93498: exiting _queue_task() for managed-node1/command 41684 1727204446.93509: done queuing things up, now waiting for results queue to drain 41684 1727204446.93511: waiting for pending results... 
41684 1727204446.94818: running TaskExecutor() for managed-node1/TASK: Enable EPEL 8 41684 1727204446.94996: in run() - task 0affcd87-79f5-3839-086d-0000000000f3 41684 1727204446.95182: variable 'ansible_search_path' from source: unknown 41684 1727204446.95186: variable 'ansible_search_path' from source: unknown 41684 1727204446.95221: calling self._execute() 41684 1727204446.95602: variable 'ansible_host' from source: host vars for 'managed-node1' 41684 1727204446.95617: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41684 1727204446.95633: variable 'omit' from source: magic vars 41684 1727204446.96607: variable 'ansible_distribution' from source: facts 41684 1727204446.96685: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): True 41684 1727204446.96943: variable 'ansible_distribution_major_version' from source: facts 41684 1727204446.97000: Evaluated conditional (ansible_distribution_major_version in ['7', '8']): False 41684 1727204446.97023: when evaluation is False, skipping this task 41684 1727204446.97098: _execute() done 41684 1727204446.97111: dumping result to json 41684 1727204446.97122: done dumping result, returning 41684 1727204446.97137: done running TaskExecutor() for managed-node1/TASK: Enable EPEL 8 [0affcd87-79f5-3839-086d-0000000000f3] 41684 1727204446.97150: sending task result for task 0affcd87-79f5-3839-086d-0000000000f3 skipping: [managed-node1] => { "changed": false, "false_condition": "ansible_distribution_major_version in ['7', '8']", "skip_reason": "Conditional result was False" } 41684 1727204446.97339: no more pending results, returning what we have 41684 1727204446.97343: results queue empty 41684 1727204446.97344: checking for any_errors_fatal 41684 1727204446.97353: done checking for any_errors_fatal 41684 1727204446.97354: checking for max_fail_percentage 41684 1727204446.97355: done checking for max_fail_percentage 41684 1727204446.97356: checking to see if all hosts have failed and 
the running result is not ok 41684 1727204446.97357: done checking to see if all hosts have failed 41684 1727204446.97358: getting the remaining hosts for this loop 41684 1727204446.97359: done getting the remaining hosts for this loop 41684 1727204446.97368: getting the next task for host managed-node1 41684 1727204446.97379: done getting next task for host managed-node1 41684 1727204446.97382: ^ task is: TASK: Enable EPEL 6 41684 1727204446.97386: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 41684 1727204446.97390: getting variables 41684 1727204446.97392: in VariableManager get_vars() 41684 1727204446.97425: Calling all_inventory to load vars for managed-node1 41684 1727204446.97428: Calling groups_inventory to load vars for managed-node1 41684 1727204446.97432: Calling all_plugins_inventory to load vars for managed-node1 41684 1727204446.97446: Calling all_plugins_play to load vars for managed-node1 41684 1727204446.97449: Calling groups_plugins_inventory to load vars for managed-node1 41684 1727204446.97452: Calling groups_plugins_play to load vars for managed-node1 41684 1727204446.97682: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41684 1727204446.97928: done with get_vars() 41684 1727204446.97940: done getting variables 41684 1727204446.98219: done sending task result for task 0affcd87-79f5-3839-086d-0000000000f3 41684 1727204446.98222: WORKER PROCESS EXITING 41684 1727204446.98253: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Enable EPEL 6] *********************************************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:42 Tuesday 24 September 2024 15:00:46 -0400 (0:00:00.053) 0:00:03.384 ***** 41684 1727204446.98296: entering _queue_task() for managed-node1/copy 41684 1727204446.98736: worker is 1 (out of 1 available) 41684 1727204446.98748: exiting _queue_task() for managed-node1/copy 41684 1727204446.98760: done queuing things up, now waiting for results queue to drain 41684 1727204446.98761: waiting for pending results... 
41684 1727204446.99038: running TaskExecutor() for managed-node1/TASK: Enable EPEL 6 41684 1727204446.99160: in run() - task 0affcd87-79f5-3839-086d-0000000000f5 41684 1727204446.99182: variable 'ansible_search_path' from source: unknown 41684 1727204446.99189: variable 'ansible_search_path' from source: unknown 41684 1727204446.99238: calling self._execute() 41684 1727204446.99319: variable 'ansible_host' from source: host vars for 'managed-node1' 41684 1727204446.99336: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41684 1727204446.99348: variable 'omit' from source: magic vars 41684 1727204446.99889: variable 'ansible_distribution' from source: facts 41684 1727204446.99918: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): True 41684 1727204447.00868: variable 'ansible_distribution_major_version' from source: facts 41684 1727204447.00900: Evaluated conditional (ansible_distribution_major_version == '6'): False 41684 1727204447.00909: when evaluation is False, skipping this task 41684 1727204447.00918: _execute() done 41684 1727204447.00925: dumping result to json 41684 1727204447.00932: done dumping result, returning 41684 1727204447.00943: done running TaskExecutor() for managed-node1/TASK: Enable EPEL 6 [0affcd87-79f5-3839-086d-0000000000f5] 41684 1727204447.00953: sending task result for task 0affcd87-79f5-3839-086d-0000000000f5 skipping: [managed-node1] => { "changed": false, "false_condition": "ansible_distribution_major_version == '6'", "skip_reason": "Conditional result was False" } 41684 1727204447.01352: no more pending results, returning what we have 41684 1727204447.01356: results queue empty 41684 1727204447.01357: checking for any_errors_fatal 41684 1727204447.01365: done checking for any_errors_fatal 41684 1727204447.01366: checking for max_fail_percentage 41684 1727204447.01368: done checking for max_fail_percentage 41684 1727204447.01368: checking to see if all hosts have failed and the running 
result is not ok 41684 1727204447.01369: done checking to see if all hosts have failed 41684 1727204447.01370: getting the remaining hosts for this loop 41684 1727204447.01372: done getting the remaining hosts for this loop 41684 1727204447.01376: getting the next task for host managed-node1 41684 1727204447.01386: done getting next task for host managed-node1 41684 1727204447.01389: ^ task is: TASK: Set network provider to 'nm' 41684 1727204447.01391: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 41684 1727204447.01398: getting variables 41684 1727204447.01400: in VariableManager get_vars() 41684 1727204447.01431: Calling all_inventory to load vars for managed-node1 41684 1727204447.01434: Calling groups_inventory to load vars for managed-node1 41684 1727204447.01438: Calling all_plugins_inventory to load vars for managed-node1 41684 1727204447.01451: Calling all_plugins_play to load vars for managed-node1 41684 1727204447.01454: Calling groups_plugins_inventory to load vars for managed-node1 41684 1727204447.01457: Calling groups_plugins_play to load vars for managed-node1 41684 1727204447.01714: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41684 1727204447.01944: done with get_vars() 41684 1727204447.01955: done getting variables 41684 1727204447.02130: done sending task result for task 0affcd87-79f5-3839-086d-0000000000f5 41684 1727204447.02134: WORKER PROCESS EXITING 41684 1727204447.02171: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 
(found_in_cache=True, class_only=True) TASK [Set network provider to 'nm'] ******************************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/tests_route_device_nm.yml:13 Tuesday 24 September 2024 15:00:47 -0400 (0:00:00.039) 0:00:03.423 ***** 41684 1727204447.02208: entering _queue_task() for managed-node1/set_fact 41684 1727204447.02683: worker is 1 (out of 1 available) 41684 1727204447.02696: exiting _queue_task() for managed-node1/set_fact 41684 1727204447.02708: done queuing things up, now waiting for results queue to drain 41684 1727204447.02709: waiting for pending results... 41684 1727204447.02982: running TaskExecutor() for managed-node1/TASK: Set network provider to 'nm' 41684 1727204447.03088: in run() - task 0affcd87-79f5-3839-086d-000000000007 41684 1727204447.03119: variable 'ansible_search_path' from source: unknown 41684 1727204447.03170: calling self._execute() 41684 1727204447.03268: variable 'ansible_host' from source: host vars for 'managed-node1' 41684 1727204447.03280: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41684 1727204447.03295: variable 'omit' from source: magic vars 41684 1727204447.03421: variable 'omit' from source: magic vars 41684 1727204447.03485: variable 'omit' from source: magic vars 41684 1727204447.03525: variable 'omit' from source: magic vars 41684 1727204447.03588: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 41684 1727204447.03633: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 41684 1727204447.03674: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 41684 1727204447.03699: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41684 1727204447.03719: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41684 1727204447.03763: variable 'inventory_hostname' from source: host vars for 'managed-node1' 41684 1727204447.03797: variable 'ansible_host' from source: host vars for 'managed-node1' 41684 1727204447.03807: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41684 1727204447.03952: Set connection var ansible_connection to ssh 41684 1727204447.03966: Set connection var ansible_pipelining to False 41684 1727204447.03986: Set connection var ansible_module_compression to ZIP_DEFLATED 41684 1727204447.03995: Set connection var ansible_timeout to 10 41684 1727204447.04007: Set connection var ansible_shell_executable to /bin/sh 41684 1727204447.04014: Set connection var ansible_shell_type to sh 41684 1727204447.04051: variable 'ansible_shell_executable' from source: unknown 41684 1727204447.04061: variable 'ansible_connection' from source: unknown 41684 1727204447.04072: variable 'ansible_module_compression' from source: unknown 41684 1727204447.04089: variable 'ansible_shell_type' from source: unknown 41684 1727204447.04101: variable 'ansible_shell_executable' from source: unknown 41684 1727204447.04109: variable 'ansible_host' from source: host vars for 'managed-node1' 41684 1727204447.04117: variable 'ansible_pipelining' from source: unknown 41684 1727204447.04125: variable 'ansible_timeout' from source: unknown 41684 1727204447.04134: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41684 1727204447.04310: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 41684 1727204447.04331: variable 'omit' from source: magic vars 41684 1727204447.04341: starting 
attempt loop 41684 1727204447.04348: running the handler 41684 1727204447.04372: handler run complete 41684 1727204447.04388: attempt loop complete, returning result 41684 1727204447.04396: _execute() done 41684 1727204447.04411: dumping result to json 41684 1727204447.04419: done dumping result, returning 41684 1727204447.04438: done running TaskExecutor() for managed-node1/TASK: Set network provider to 'nm' [0affcd87-79f5-3839-086d-000000000007] 41684 1727204447.04451: sending task result for task 0affcd87-79f5-3839-086d-000000000007 41684 1727204447.04565: done sending task result for task 0affcd87-79f5-3839-086d-000000000007 ok: [managed-node1] => { "ansible_facts": { "network_provider": "nm" }, "changed": false } 41684 1727204447.04621: no more pending results, returning what we have 41684 1727204447.04625: results queue empty 41684 1727204447.04626: checking for any_errors_fatal 41684 1727204447.04634: done checking for any_errors_fatal 41684 1727204447.04634: checking for max_fail_percentage 41684 1727204447.04636: done checking for max_fail_percentage 41684 1727204447.04637: checking to see if all hosts have failed and the running result is not ok 41684 1727204447.04638: done checking to see if all hosts have failed 41684 1727204447.04639: getting the remaining hosts for this loop 41684 1727204447.04641: done getting the remaining hosts for this loop 41684 1727204447.04645: getting the next task for host managed-node1 41684 1727204447.04653: done getting next task for host managed-node1 41684 1727204447.04655: ^ task is: TASK: meta (flush_handlers) 41684 1727204447.04659: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 41684 1727204447.04664: getting variables 41684 1727204447.04666: in VariableManager get_vars() 41684 1727204447.04701: Calling all_inventory to load vars for managed-node1 41684 1727204447.04704: Calling groups_inventory to load vars for managed-node1 41684 1727204447.04708: Calling all_plugins_inventory to load vars for managed-node1 41684 1727204447.04720: Calling all_plugins_play to load vars for managed-node1 41684 1727204447.04723: Calling groups_plugins_inventory to load vars for managed-node1 41684 1727204447.04727: Calling groups_plugins_play to load vars for managed-node1 41684 1727204447.04933: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41684 1727204447.05161: done with get_vars() 41684 1727204447.05177: done getting variables 41684 1727204447.05260: in VariableManager get_vars() 41684 1727204447.05273: Calling all_inventory to load vars for managed-node1 41684 1727204447.05276: Calling groups_inventory to load vars for managed-node1 41684 1727204447.05278: Calling all_plugins_inventory to load vars for managed-node1 41684 1727204447.05283: Calling all_plugins_play to load vars for managed-node1 41684 1727204447.05286: Calling groups_plugins_inventory to load vars for managed-node1 41684 1727204447.05289: Calling groups_plugins_play to load vars for managed-node1 41684 1727204447.05794: WORKER PROCESS EXITING 41684 1727204447.05816: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41684 1727204447.06048: done with get_vars() 41684 1727204447.06066: done queuing things up, now waiting for results queue to drain 41684 1727204447.06069: results queue empty 41684 1727204447.06069: checking for any_errors_fatal 41684 1727204447.06072: done checking for any_errors_fatal 41684 1727204447.06073: checking for max_fail_percentage 41684 1727204447.06074: done checking for max_fail_percentage 41684 1727204447.06075: checking to see if all 
hosts have failed and the running result is not ok 41684 1727204447.06075: done checking to see if all hosts have failed 41684 1727204447.06076: getting the remaining hosts for this loop 41684 1727204447.06077: done getting the remaining hosts for this loop 41684 1727204447.06080: getting the next task for host managed-node1 41684 1727204447.06090: done getting next task for host managed-node1 41684 1727204447.06092: ^ task is: TASK: meta (flush_handlers) 41684 1727204447.06097: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 41684 1727204447.06106: getting variables 41684 1727204447.06108: in VariableManager get_vars() 41684 1727204447.06117: Calling all_inventory to load vars for managed-node1 41684 1727204447.06120: Calling groups_inventory to load vars for managed-node1 41684 1727204447.06122: Calling all_plugins_inventory to load vars for managed-node1 41684 1727204447.06127: Calling all_plugins_play to load vars for managed-node1 41684 1727204447.06129: Calling groups_plugins_inventory to load vars for managed-node1 41684 1727204447.06132: Calling groups_plugins_play to load vars for managed-node1 41684 1727204447.06293: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41684 1727204447.06644: done with get_vars() 41684 1727204447.06654: done getting variables 41684 1727204447.06709: in VariableManager get_vars() 41684 1727204447.06718: Calling all_inventory to load vars for managed-node1 41684 1727204447.06721: Calling groups_inventory to load vars for managed-node1 41684 1727204447.06723: Calling all_plugins_inventory to load vars for managed-node1 41684 1727204447.06728: Calling all_plugins_play to load vars for managed-node1 41684 1727204447.06730: Calling 
groups_plugins_inventory to load vars for managed-node1 41684 1727204447.06733: Calling groups_plugins_play to load vars for managed-node1 41684 1727204447.07158: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41684 1727204447.07605: done with get_vars() 41684 1727204447.07753: done queuing things up, now waiting for results queue to drain 41684 1727204447.07755: results queue empty 41684 1727204447.07756: checking for any_errors_fatal 41684 1727204447.07757: done checking for any_errors_fatal 41684 1727204447.07758: checking for max_fail_percentage 41684 1727204447.07760: done checking for max_fail_percentage 41684 1727204447.07760: checking to see if all hosts have failed and the running result is not ok 41684 1727204447.07761: done checking to see if all hosts have failed 41684 1727204447.07762: getting the remaining hosts for this loop 41684 1727204447.07763: done getting the remaining hosts for this loop 41684 1727204447.07767: getting the next task for host managed-node1 41684 1727204447.07771: done getting next task for host managed-node1 41684 1727204447.07772: ^ task is: None 41684 1727204447.07773: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 41684 1727204447.07774: done queuing things up, now waiting for results queue to drain 41684 1727204447.07775: results queue empty 41684 1727204447.07776: checking for any_errors_fatal 41684 1727204447.07777: done checking for any_errors_fatal 41684 1727204447.07777: checking for max_fail_percentage 41684 1727204447.07778: done checking for max_fail_percentage 41684 1727204447.07779: checking to see if all hosts have failed and the running result is not ok 41684 1727204447.07780: done checking to see if all hosts have failed 41684 1727204447.07782: getting the next task for host managed-node1 41684 1727204447.07784: done getting next task for host managed-node1 41684 1727204447.07785: ^ task is: None 41684 1727204447.07786: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 41684 1727204447.07967: in VariableManager get_vars() 41684 1727204447.07994: done with get_vars() 41684 1727204447.08000: in VariableManager get_vars() 41684 1727204447.08015: done with get_vars() 41684 1727204447.08019: variable 'omit' from source: magic vars 41684 1727204447.08124: in VariableManager get_vars() 41684 1727204447.08187: done with get_vars() 41684 1727204447.08212: variable 'omit' from source: magic vars PLAY [Test output device of routes] ******************************************** 41684 1727204447.08750: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py (found_in_cache=True, class_only=False) 41684 1727204447.08783: getting the remaining hosts for this loop 41684 1727204447.08785: done getting the remaining hosts for this loop 41684 1727204447.08788: getting the next task for host managed-node1 41684 1727204447.08791: done getting next task for host managed-node1 41684 1727204447.08793: ^ task is: TASK: Gathering Facts 41684 1727204447.08795: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 41684 1727204447.08797: getting variables 41684 1727204447.08798: in VariableManager get_vars() 41684 1727204447.08815: Calling all_inventory to load vars for managed-node1 41684 1727204447.08817: Calling groups_inventory to load vars for managed-node1 41684 1727204447.08820: Calling all_plugins_inventory to load vars for managed-node1 41684 1727204447.08826: Calling all_plugins_play to load vars for managed-node1 41684 1727204447.08842: Calling groups_plugins_inventory to load vars for managed-node1 41684 1727204447.08845: Calling groups_plugins_play to load vars for managed-node1 41684 1727204447.09011: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41684 1727204447.09263: done with get_vars() 41684 1727204447.09273: done getting variables 41684 1727204447.09321: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Gathering Facts] ********************************************************* task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_route_device.yml:3 Tuesday 24 September 2024 15:00:47 -0400 (0:00:00.071) 0:00:03.494 ***** 41684 1727204447.09343: entering _queue_task() for managed-node1/gather_facts 41684 1727204447.10297: worker is 1 (out of 1 available) 41684 1727204447.10423: exiting _queue_task() for managed-node1/gather_facts 41684 1727204447.10437: done queuing things up, now waiting for results queue to drain 41684 1727204447.10438: waiting for pending results... 
41684 1727204447.11432: running TaskExecutor() for managed-node1/TASK: Gathering Facts 41684 1727204447.11520: in run() - task 0affcd87-79f5-3839-086d-00000000011b 41684 1727204447.11540: variable 'ansible_search_path' from source: unknown 41684 1727204447.11591: calling self._execute() 41684 1727204447.11680: variable 'ansible_host' from source: host vars for 'managed-node1' 41684 1727204447.11692: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41684 1727204447.11704: variable 'omit' from source: magic vars 41684 1727204447.12489: variable 'ansible_distribution_major_version' from source: facts 41684 1727204447.12510: Evaluated conditional (ansible_distribution_major_version != '6'): True 41684 1727204447.12520: variable 'omit' from source: magic vars 41684 1727204447.12549: variable 'omit' from source: magic vars 41684 1727204447.12595: variable 'omit' from source: magic vars 41684 1727204447.12641: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 41684 1727204447.12890: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 41684 1727204447.12917: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 41684 1727204447.12939: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41684 1727204447.12953: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41684 1727204447.12991: variable 'inventory_hostname' from source: host vars for 'managed-node1' 41684 1727204447.12999: variable 'ansible_host' from source: host vars for 'managed-node1' 41684 1727204447.13013: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41684 1727204447.13117: Set connection var ansible_connection to ssh 41684 1727204447.13179: Set 
connection var ansible_pipelining to False 41684 1727204447.13189: Set connection var ansible_module_compression to ZIP_DEFLATED 41684 1727204447.13200: Set connection var ansible_timeout to 10 41684 1727204447.13480: Set connection var ansible_shell_executable to /bin/sh 41684 1727204447.13488: Set connection var ansible_shell_type to sh 41684 1727204447.13520: variable 'ansible_shell_executable' from source: unknown 41684 1727204447.13528: variable 'ansible_connection' from source: unknown 41684 1727204447.13535: variable 'ansible_module_compression' from source: unknown 41684 1727204447.13541: variable 'ansible_shell_type' from source: unknown 41684 1727204447.13547: variable 'ansible_shell_executable' from source: unknown 41684 1727204447.13553: variable 'ansible_host' from source: host vars for 'managed-node1' 41684 1727204447.13560: variable 'ansible_pipelining' from source: unknown 41684 1727204447.13573: variable 'ansible_timeout' from source: unknown 41684 1727204447.13582: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41684 1727204447.13767: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 41684 1727204447.13784: variable 'omit' from source: magic vars 41684 1727204447.13792: starting attempt loop 41684 1727204447.13874: running the handler 41684 1727204447.13895: variable 'ansible_facts' from source: unknown 41684 1727204447.13918: _low_level_execute_command(): starting 41684 1727204447.13929: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 41684 1727204447.15335: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 41684 1727204447.15340: stderr chunk (state=3): 
>>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41684 1727204447.15382: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found <<< 41684 1727204447.15386: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 41684 1727204447.15389: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204447.15588: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 41684 1727204447.15591: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41684 1727204447.15640: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41684 1727204447.15744: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41684 1727204447.17285: stdout chunk (state=3): >>>/root <<< 41684 1727204447.17377: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41684 1727204447.17470: stderr chunk (state=3): >>><<< 41684 1727204447.17473: stdout chunk (state=3): >>><<< 41684 1727204447.17592: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final 
all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41684 1727204447.17595: _low_level_execute_command(): starting 41684 1727204447.17599: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204447.1749563-42226-24439077889401 `" && echo ansible-tmp-1727204447.1749563-42226-24439077889401="` echo /root/.ansible/tmp/ansible-tmp-1727204447.1749563-42226-24439077889401 `" ) && sleep 0' 41684 1727204447.19076: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41684 1727204447.19229: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204447.19241: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: 
re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 41684 1727204447.19244: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204447.19290: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 41684 1727204447.19357: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41684 1727204447.19399: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41684 1727204447.19545: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 41684 1727204447.22122: stdout chunk (state=3): >>>ansible-tmp-1727204447.1749563-42226-24439077889401=/root/.ansible/tmp/ansible-tmp-1727204447.1749563-42226-24439077889401 <<< 41684 1727204447.22607: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41684 1727204447.22611: stdout chunk (state=3): >>><<< 41684 1727204447.22618: stderr chunk (state=3): >>><<< 41684 1727204447.22669: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204447.1749563-42226-24439077889401=/root/.ansible/tmp/ansible-tmp-1727204447.1749563-42226-24439077889401 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 41684 1727204447.22673: variable 'ansible_module_compression' from source: unknown 41684 1727204447.23071: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-41684fyviudxd/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 41684 1727204447.23075: variable 'ansible_facts' from source: unknown 41684 1727204447.23079: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204447.1749563-42226-24439077889401/AnsiballZ_setup.py 41684 1727204447.23291: Sending initial data 41684 1727204447.23553: Sent initial data (153 bytes) 41684 1727204447.24537: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 41684 1727204447.24556: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41684 1727204447.24781: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41684 1727204447.24802: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41684 1727204447.24845: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 41684 1727204447.24860: stderr chunk (state=3): >>>debug2: match not found <<< 41684 1727204447.24879: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204447.24897: stderr chunk (state=3): >>>debug1: configuration requests final 
Match pass <<< 41684 1727204447.24908: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 41684 1727204447.24918: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 41684 1727204447.24929: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41684 1727204447.24942: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41684 1727204447.24964: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41684 1727204447.24984: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 41684 1727204447.24997: stderr chunk (state=3): >>>debug2: match found <<< 41684 1727204447.25022: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204447.25132: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 41684 1727204447.25152: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41684 1727204447.25181: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41684 1727204447.25370: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 41684 1727204447.27729: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 41684 1727204447.27854: stderr chunk (state=3): >>>debug1: 
Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 41684 1727204447.27858: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-41684fyviudxd/tmpt0l7p30d /root/.ansible/tmp/ansible-tmp-1727204447.1749563-42226-24439077889401/AnsiballZ_setup.py <<< 41684 1727204447.27898: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 41684 1727204447.30891: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41684 1727204447.31142: stderr chunk (state=3): >>><<< 41684 1727204447.31146: stdout chunk (state=3): >>><<< 41684 1727204447.31149: done transferring module to remote 41684 1727204447.31151: _low_level_execute_command(): starting 41684 1727204447.31153: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204447.1749563-42226-24439077889401/ /root/.ansible/tmp/ansible-tmp-1727204447.1749563-42226-24439077889401/AnsiballZ_setup.py && sleep 0' 41684 1727204447.31880: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config 
debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 41684 1727204447.31926: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 41684 1727204447.34368: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41684 1727204447.34373: stdout chunk (state=3): >>><<< 41684 1727204447.34375: stderr chunk (state=3): >>><<< 41684 1727204447.34387: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 41684 1727204447.34390: _low_level_execute_command(): starting 41684 1727204447.34392: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204447.1749563-42226-24439077889401/AnsiballZ_setup.py && sleep 0' 41684 1727204447.34875: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 
3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 41684 1727204447.34880: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41684 1727204447.34908: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204447.34911: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 41684 1727204447.34914: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204447.34970: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 41684 1727204447.34982: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 41684 1727204447.35051: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 41684 1727204448.98334: stdout chunk (state=3): >>> <<< 41684 1727204448.98355: stdout chunk (state=3): >>>{"ansible_facts": {"ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "root", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], 
"ansible_virtualization_tech_host": [], "ansible_system": "Linux", "ansible_kernel": "5.14.0-511.el9.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Thu Sep 19 06:52:39 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.9.19", "ansible_fqdn": "managed-node1", "ansible_hostname": "managed-node1", "ansible_nodename": "managed-node1", "ansible_domain": "", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "622812727ab94fd6acd7dd0d437b6e90", "ansible_ssh_host_key_dsa_public": "AAAAB3NzaC1kc3MAAACBAL33r0sK53nK1ELEWEygFfBly+jKL3G1irB+e4OjfP+034giVSb4+qmZbsccYzFRUysDiwQ9AOkXxjXzuDmR+xqyvjg1wiGR1mtnhVEjD5QOMP3FrsA4T0YUj+99RePF5V1syZcivhL83fhGMQW2xqX2DsatToaaogZ2OB3PfjtjAAAAFQDxVD/D0jmbOX2y1lkpNUepQHopQwAAAIEAmIlCnXBlcPjUWk7TsM1FtBhlH1jHHCOTF1EkipNNB3yizhCo4XzHdZ42Etc3A12/rcZ94rFaauqCV6XrShBkQ2YBIcz9u8BOyWI/nScoq9IA/qLrhWWtjBWDyrdKnEa5YZssQtDa+FaZQkzy1TQpvFQxv5c95+TrmPFgDpw+0q0AAACBAKYOTFtEPTGzq9w6YdMspRWk65ZgXou58bQl818PvNnuZKKVReDFknfNCcabfj+HjlOg9wBCZZ+D3vopxZ4Qgevz/pLqcnLY7Kxx+xf6NhqDwcEwkHk/VYomBLrfyEZP8N81dcv36ZZUVoca5Y+2ZG2o1gC632nLGosyJBtmPmel", "ansible_ssh_host_key_dsa_public_keytype": "ssh-dss", "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCzhflzK5hY1zLI6wcdpu95QnCG0TwdK/8SyDudYYHDoRycyI9pVuSUQsXpQq3jHdjCurFgtKtyr8lvo1dWvKJ9SZpQk4asejncDNODyhSR95eNQg6E1G2kN1mscOp76cjW9Muvyhcku112WdRWTEojLJL5DfJAWrsWwHJI+QuhZuKvrlsxPvfOuY5td/aGC/Ydzbjkmya2qvXXJRscQArDnYmiPiatkFESRif9MXdmIn2LqQXAcZGFUG+SWQvZR1PDWKI2U5HxvoeUf+Uh2jDO3mFDWao9+SGRC2QuO+xLJgoiKIx2L3GWLTkbKjAbsk0ieda<<< 41684 1727204448.98391: stdout chunk (state=3): >>>Uuh+GdmUatsU09UVZi9IYBJYjhiYuZKsYx2LNpBqu8fxh5RaBfadQzkDGVBJE45/9X+9vlSygk3zMak9yWtS9vfV+CoODJx9wA1tv3r0Veiy/Y9bbcT7DtQhiWscP2X/cF2QZtdabW+Rb+zKZomn+6upN+zZeyVRClRsqVNURxevMs+UyJTKV481ayMU=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": 
"AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBHpEZiCiKJZKK5PvXzPGl0kyJcU4P7nxoUjBffLcHt9dAB0RhjGORZ4v3/W6TdO0PAsLaKZ7WyFecLN3V9VWyiA=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIK5vZWfq5/76ny3vCPOJqG/mpsIiiNwZzQWhA7bM1PFT", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "9", "ansible_distribution_major_version": "9", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_env": {"SHELL": "/bin/bash", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.9", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.14.85 42862 10.31.9.148 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.14.85 42862 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "which_declare": "declare -f", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/1", "BASH_FUNC_which%%": "() { ( alias;\n eval ${which_declare} ) | /usr/bin/which --tty-only --read-alias --read-functions --show-tilde --show-dot $@\n}"}, "ansible_lsb": {}, "ansible_fibre_channel_wwn": [], "ansible_date_time": {"year": "2024", <<< 41684 1727204448.98408: stdout chunk (state=3): >>>"month": "09", "weekday": "Tuesday", "weekday_number": "2", "weeknumber": "39", "day": "24", "hour": "15", "minute": "00", "second": "47", "epoch": "1727204447", "epoch_int": 
"1727204447", "date": "2024-09-24", "time": "15:00:47", "iso8601_micro": "2024-09-24T19:00:47.679344Z", "iso8601": "2024-09-24T19:00:47Z", "iso8601_basic": "20240924T150047679344", "iso8601_basic_short": "20240924T150047", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3532, "ansible_memfree_mb": 2794, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3532, "used": 738, "free": 2794}, "nocache": {"free": 3269, "used": 263}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec28c5e6-50d6-5684-e735-f75357a23b08", "ansible_product_uuid": "ec28c5e6-50d6-5684-e735-f75357a23b08", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda1": {"links": {"ids": [], "uuids": ["ad406aa3-aab4-4a6a-aa73-3e870a6316ae"], "labels": [<<< 41684 1727204448.98602: stdout chunk (state=3): >>>], "masters": []}, "start": "2048", "sectors": "524285919", 
"sectorsize": 512, "size": "250.00 GB", "uuid": "ad406aa3-aab4-4a6a-aa73-3e870a6316ae", "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda1": ["ad406aa3-aab4-4a6a-aa73-3e870a6316ae"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 710, "ansible_lvm": "N/A", "ansible_mounts": [{"mount": "/", "device": "/dev/xvda1", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268367278080, "size_available": 264271527936, "block_size": 4096, "block_total": 65519355, "block_available": 64519416, "block_used": 999939, "inode_total": 131071472, "inode_available": 130998226, "inode_used": 73246, "uuid": "ad406aa3-aab4-4a6a-aa73-3e870a6316ae"}], "ansible_local": {}, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_loadavg": {"1m": 0.39, "5m": 0.43, "15m": 0.27}, "ansible_is_chroot": false, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_iscsi_iqn": "", "ansible_python": {"version": {"major": 3, "minor": 9, "micro": 19, "releaselevel": "final", "serial": 0}, "version_info": [3, 9, 19, "final", 0], "executable": "/usr/bin/python3.9", "has_sslcontext": true, "type": "cpython"}, "ansible_fips": false, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-511.el9.x86_64", "root": "UUID=ad406aa3-aab4-4a6a-aa73-3e870a6316ae", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-511.el9.x86_64", "root": "UUID=ad406aa3-aab4-4a6a-aa73-3e870a6316ae", "ro": true, "rhgb": true, "crashkernel": 
"1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_interfaces": ["lo", "rpltstbr", "eth0"], "ansible_eth0": {"device": "eth0", "macaddress": "12:8f:92:e7:c1:ab", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.9.148", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22"}, "ipv6": [{"address": "fe80::108f:92ff:fee7:c1ab", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", 
"tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_rpltstbr": {"device": "rpltstbr", "macaddress": "4a:d1:a2:43:cd:1d", "mtu": 1500, "active": false, "type": "bridge", "interfaces": [], "id": "8000.000000000000", "stp": false, "speed": -1, "promisc": false, "ipv4": {"address": "192.0.2.72", "broadcast": "", "netmask": "255.255.255.254", "network": "192.0.2.72", "prefix": "31"}, "features": {"rx_checksumming": "off [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "on", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": 
"on", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "on", "tx_fcoe_segmentation": "on", "tx_gre_segmentation": "on", "tx_gre_csum_segmentation": "on", "tx_ipxip4_segmentation": "on", "tx_ipxip6_segmentation": "on", "tx_udp_tnl_segmentation": "on", "tx_udp_tnl_csum_segmentation": "on", "tx_gso_partial": "on", "tx_tunnel_remcsum_segmentation": "on", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "on", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "on", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on 
[fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off 
[fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.8.1", "interface": "eth0", "address": "10.31.9.148", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22", "macaddress": "12:8f:92:e7:c1:ab", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.9.148", "192.0.2.72"], "ansible_all_ipv6_addresses": ["fe80::108f:92ff:fee7:c1ab"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.9.148", "127.0.0.0/8", "127.0.0.1", "192.0.2.72"], "ipv6": ["::1", "fe80::108f:92ff:fee7:c1ab"]}, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:d5aef1ea-3141-48ae-bf33-0c6b351dd422", "ansible_pkg_mgr": "dnf", "ansible_service_mgr": "systemd", "ansible_apparmor": {"status": "disabled"}, "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 41684 1727204449.01187: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.148 closed. 
<<< 41684 1727204449.01279: stderr chunk (state=3): >>><<< 41684 1727204449.01282: stdout chunk (state=3): >>><<< 41684 1727204449.01323: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {...}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config 
debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 Shared connection to 10.31.9.148 closed. 41684 1727204449.01728: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204447.1749563-42226-24439077889401/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 41684 1727204449.01748: _low_level_execute_command(): starting 41684 1727204449.01753: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204447.1749563-42226-24439077889401/ > /dev/null 2>&1 && sleep 0' 41684 1727204449.03314: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 41684 1727204449.03318: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41684 
1727204449.03320: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41684 1727204449.03323: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41684 1727204449.03325: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 41684 1727204449.03327: stderr chunk (state=3): >>>debug2: match not found <<< 41684 1727204449.03329: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204449.03332: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 41684 1727204449.03342: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 41684 1727204449.03354: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 41684 1727204449.03371: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41684 1727204449.03384: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41684 1727204449.03399: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41684 1727204449.03411: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 41684 1727204449.03422: stderr chunk (state=3): >>>debug2: match found <<< 41684 1727204449.03435: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204449.03515: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 41684 1727204449.03538: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41684 1727204449.03555: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41684 1727204449.03655: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 41684 
1727204449.06278: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41684 1727204449.06340: stderr chunk (state=3): >>><<< 41684 1727204449.06344: stdout chunk (state=3): >>><<< 41684 1727204449.06370: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 41684 1727204449.06574: handler run complete 41684 1727204449.06577: variable 'ansible_facts' from source: unknown 41684 1727204449.06616: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41684 1727204449.07656: variable 'ansible_facts' from source: unknown 41684 1727204449.07776: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41684 1727204449.07930: attempt loop complete, returning result 41684 1727204449.07944: _execute() done 41684 1727204449.07952: dumping 
result to json 41684 1727204449.08000: done dumping result, returning 41684 1727204449.08014: done running TaskExecutor() for managed-node1/TASK: Gathering Facts [0affcd87-79f5-3839-086d-00000000011b] 41684 1727204449.08025: sending task result for task 0affcd87-79f5-3839-086d-00000000011b ok: [managed-node1] 41684 1727204449.08939: no more pending results, returning what we have 41684 1727204449.08944: results queue empty 41684 1727204449.08945: checking for any_errors_fatal 41684 1727204449.08946: done checking for any_errors_fatal 41684 1727204449.08947: checking for max_fail_percentage 41684 1727204449.08949: done checking for max_fail_percentage 41684 1727204449.08949: checking to see if all hosts have failed and the running result is not ok 41684 1727204449.08950: done checking to see if all hosts have failed 41684 1727204449.08951: getting the remaining hosts for this loop 41684 1727204449.08953: done getting the remaining hosts for this loop 41684 1727204449.08957: getting the next task for host managed-node1 41684 1727204449.08968: done getting next task for host managed-node1 41684 1727204449.08970: ^ task is: TASK: meta (flush_handlers) 41684 1727204449.08972: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 41684 1727204449.08983: getting variables 41684 1727204449.08985: in VariableManager get_vars() 41684 1727204449.09021: Calling all_inventory to load vars for managed-node1 41684 1727204449.09023: Calling groups_inventory to load vars for managed-node1 41684 1727204449.09026: Calling all_plugins_inventory to load vars for managed-node1 41684 1727204449.09036: Calling all_plugins_play to load vars for managed-node1 41684 1727204449.09039: Calling groups_plugins_inventory to load vars for managed-node1 41684 1727204449.09041: Calling groups_plugins_play to load vars for managed-node1 41684 1727204449.09227: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41684 1727204449.09851: done sending task result for task 0affcd87-79f5-3839-086d-00000000011b 41684 1727204449.09855: WORKER PROCESS EXITING 41684 1727204449.09897: done with get_vars() 41684 1727204449.09909: done getting variables 41684 1727204449.10211: in VariableManager get_vars() 41684 1727204449.10227: Calling all_inventory to load vars for managed-node1 41684 1727204449.10229: Calling groups_inventory to load vars for managed-node1 41684 1727204449.10231: Calling all_plugins_inventory to load vars for managed-node1 41684 1727204449.10236: Calling all_plugins_play to load vars for managed-node1 41684 1727204449.10238: Calling groups_plugins_inventory to load vars for managed-node1 41684 1727204449.10246: Calling groups_plugins_play to load vars for managed-node1 41684 1727204449.10631: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41684 1727204449.11003: done with get_vars() 41684 1727204449.11018: done queuing things up, now waiting for results queue to drain 41684 1727204449.11020: results queue empty 41684 1727204449.11021: checking for any_errors_fatal 41684 1727204449.11025: done checking for any_errors_fatal 41684 1727204449.11026: checking for max_fail_percentage 41684 
1727204449.11027: done checking for max_fail_percentage 41684 1727204449.11028: checking to see if all hosts have failed and the running result is not ok 41684 1727204449.11028: done checking to see if all hosts have failed 41684 1727204449.11029: getting the remaining hosts for this loop 41684 1727204449.11030: done getting the remaining hosts for this loop 41684 1727204449.11033: getting the next task for host managed-node1 41684 1727204449.11037: done getting next task for host managed-node1 41684 1727204449.11040: ^ task is: TASK: Set type and interface0 41684 1727204449.11042: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 41684 1727204449.11044: getting variables 41684 1727204449.11160: in VariableManager get_vars() 41684 1727204449.11181: Calling all_inventory to load vars for managed-node1 41684 1727204449.11183: Calling groups_inventory to load vars for managed-node1 41684 1727204449.11185: Calling all_plugins_inventory to load vars for managed-node1 41684 1727204449.11190: Calling all_plugins_play to load vars for managed-node1 41684 1727204449.11192: Calling groups_plugins_inventory to load vars for managed-node1 41684 1727204449.11195: Calling groups_plugins_play to load vars for managed-node1 41684 1727204449.11452: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41684 1727204449.11974: done with get_vars() 41684 1727204449.11985: done getting variables 41684 1727204449.12035: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 
(found_in_cache=True, class_only=True)

TASK [Set type and interface0] *************************************************
task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_route_device.yml:11
Tuesday 24 September 2024 15:00:49 -0400 (0:00:02.027) 0:00:05.522 *****
41684 1727204449.12070: entering _queue_task() for managed-node1/set_fact
41684 1727204449.12405: worker is 1 (out of 1 available)
41684 1727204449.12416: exiting _queue_task() for managed-node1/set_fact
41684 1727204449.12428: done queuing things up, now waiting for results queue to drain
41684 1727204449.12429: waiting for pending results...
41684 1727204449.12698: running TaskExecutor() for managed-node1/TASK: Set type and interface0
41684 1727204449.12797: in run() - task 0affcd87-79f5-3839-086d-00000000000b
41684 1727204449.12821: variable 'ansible_search_path' from source: unknown
41684 1727204449.12866: calling self._execute()
41684 1727204449.12953: variable 'ansible_host' from source: host vars for 'managed-node1'
41684 1727204449.12968: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1'
41684 1727204449.12985: variable 'omit' from source: magic vars
41684 1727204449.13479: variable 'ansible_distribution_major_version' from source: facts
41684 1727204449.13496: Evaluated conditional (ansible_distribution_major_version != '6'): True
41684 1727204449.13507: variable 'omit' from source: magic vars
41684 1727204449.13542: variable 'omit' from source: magic vars
41684 1727204449.13585: variable 'type' from source: play vars
41684 1727204449.13670: variable 'type' from source: play vars
41684 1727204449.13705: variable 'interface0' from source: play vars
41684 1727204449.13783: variable 'interface0' from source: play vars
41684 1727204449.13815: variable 'omit' from source: magic vars
41684 1727204449.13872: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection
41684 1727204449.14017: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False)
41684 1727204449.14050: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell
41684 1727204449.14091: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
41684 1727204449.14106: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
41684 1727204449.14137: variable 'inventory_hostname' from source: host vars for 'managed-node1'
41684 1727204449.14145: variable 'ansible_host' from source: host vars for 'managed-node1'
41684 1727204449.14153: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1'
41684 1727204449.14269: Set connection var ansible_connection to ssh
41684 1727204449.14281: Set connection var ansible_pipelining to False
41684 1727204449.14293: Set connection var ansible_module_compression to ZIP_DEFLATED
41684 1727204449.14309: Set connection var ansible_timeout to 10
41684 1727204449.14320: Set connection var ansible_shell_executable to /bin/sh
41684 1727204449.14326: Set connection var ansible_shell_type to sh
41684 1727204449.14353: variable 'ansible_shell_executable' from source: unknown
41684 1727204449.14360: variable 'ansible_connection' from source: unknown
41684 1727204449.14372: variable 'ansible_module_compression' from source: unknown
41684 1727204449.14378: variable 'ansible_shell_type' from source: unknown
41684 1727204449.14384: variable 'ansible_shell_executable' from source: unknown
41684 1727204449.14390: variable 'ansible_host' from source: host vars for 'managed-node1'
41684 1727204449.14397: variable 'ansible_pipelining' from source: unknown
41684 1727204449.14405: variable 'ansible_timeout' from source: unknown
41684 1727204449.14421: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1'
41684 1727204449.14626: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False)
41684 1727204449.14648: variable 'omit' from source: magic vars
41684 1727204449.14657: starting attempt loop
41684 1727204449.14668: running the handler
41684 1727204449.14684: handler run complete
41684 1727204449.14700: attempt loop complete, returning result
41684 1727204449.14706: _execute() done
41684 1727204449.14712: dumping result to json
41684 1727204449.14718: done dumping result, returning
41684 1727204449.14728: done running TaskExecutor() for managed-node1/TASK: Set type and interface0 [0affcd87-79f5-3839-086d-00000000000b]
41684 1727204449.14738: sending task result for task 0affcd87-79f5-3839-086d-00000000000b
41684 1727204449.14850: done sending task result for task 0affcd87-79f5-3839-086d-00000000000b
ok: [managed-node1] => {
    "ansible_facts": {
        "interface": "ethtest0",
        "type": "veth"
    },
    "changed": false
}
41684 1727204449.14914: no more pending results, returning what we have
41684 1727204449.14918: results queue empty
41684 1727204449.14918: checking for any_errors_fatal
41684 1727204449.14920: done checking for any_errors_fatal
41684 1727204449.14921: checking for max_fail_percentage
41684 1727204449.14923: done checking for max_fail_percentage
41684 1727204449.14924: checking to see if all hosts have failed and the running result is not ok
41684 1727204449.14924: done checking to see if all hosts have failed
41684 1727204449.14925: getting the remaining hosts for this loop
41684 1727204449.14927: done getting the remaining hosts for this loop
41684 1727204449.14931: getting the next task for host managed-node1
41684 1727204449.14937: done getting next task for host managed-node1
41684 1727204449.14940:
^ task is: TASK: Show interfaces 41684 1727204449.14942: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 41684 1727204449.14945: getting variables 41684 1727204449.14947: in VariableManager get_vars() 41684 1727204449.15024: Calling all_inventory to load vars for managed-node1 41684 1727204449.15027: Calling groups_inventory to load vars for managed-node1 41684 1727204449.15030: Calling all_plugins_inventory to load vars for managed-node1 41684 1727204449.15042: Calling all_plugins_play to load vars for managed-node1 41684 1727204449.15045: Calling groups_plugins_inventory to load vars for managed-node1 41684 1727204449.15048: Calling groups_plugins_play to load vars for managed-node1 41684 1727204449.15338: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41684 1727204449.15702: done with get_vars() 41684 1727204449.15714: done getting variables 41684 1727204449.15867: WORKER PROCESS EXITING TASK [Show interfaces] ********************************************************* task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_route_device.yml:15 Tuesday 24 September 2024 15:00:49 -0400 (0:00:00.038) 0:00:05.560 ***** 41684 1727204449.15933: entering _queue_task() for managed-node1/include_tasks 41684 1727204449.16268: worker is 1 (out of 1 available) 41684 1727204449.16280: exiting _queue_task() for managed-node1/include_tasks 41684 1727204449.16298: done queuing things up, now waiting for results queue to drain 41684 1727204449.16300: waiting for pending results... 
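The `ok: [managed-node1]` result for "Set type and interface0" above (facts `interface: ethtest0` and `type: veth`, produced after the executor resolves the play vars `type` and `interface0`) implies a plain `set_fact` task. A hypothetical reconstruction of tests_route_device.yml:11 — inferred from the logged variables and result, not copied from the playbook:

```yaml
# Hypothetical reconstruction of the task at tests_route_device.yml:11.
# Inferred from the log; the actual file contents are not shown here.
- name: Set type and interface0
  set_fact:
    type: "{{ type }}"             # play var 'type', resolved to "veth"
    interface: "{{ interface0 }}"  # play var 'interface0', resolved to "ethtest0"
```

`set_fact` runs entirely on the controller, which is why the handler completes in the log with no `_low_level_execute_command()` round trip to the managed node.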
41684 1727204449.16556: running TaskExecutor() for managed-node1/TASK: Show interfaces 41684 1727204449.16704: in run() - task 0affcd87-79f5-3839-086d-00000000000c 41684 1727204449.16734: variable 'ansible_search_path' from source: unknown 41684 1727204449.16820: calling self._execute() 41684 1727204449.16916: variable 'ansible_host' from source: host vars for 'managed-node1' 41684 1727204449.16927: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41684 1727204449.16948: variable 'omit' from source: magic vars 41684 1727204449.17331: variable 'ansible_distribution_major_version' from source: facts 41684 1727204449.17351: Evaluated conditional (ansible_distribution_major_version != '6'): True 41684 1727204449.17368: _execute() done 41684 1727204449.17385: dumping result to json 41684 1727204449.17397: done dumping result, returning 41684 1727204449.17408: done running TaskExecutor() for managed-node1/TASK: Show interfaces [0affcd87-79f5-3839-086d-00000000000c] 41684 1727204449.17419: sending task result for task 0affcd87-79f5-3839-086d-00000000000c 41684 1727204449.17566: no more pending results, returning what we have 41684 1727204449.17573: in VariableManager get_vars() 41684 1727204449.17628: Calling all_inventory to load vars for managed-node1 41684 1727204449.17632: Calling groups_inventory to load vars for managed-node1 41684 1727204449.17637: Calling all_plugins_inventory to load vars for managed-node1 41684 1727204449.17653: Calling all_plugins_play to load vars for managed-node1 41684 1727204449.17656: Calling groups_plugins_inventory to load vars for managed-node1 41684 1727204449.17660: Calling groups_plugins_play to load vars for managed-node1 41684 1727204449.17889: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41684 1727204449.18123: done with get_vars() 41684 1727204449.18132: variable 'ansible_search_path' from source: unknown 41684 1727204449.18150: we have 
included files to process 41684 1727204449.18151: generating all_blocks data 41684 1727204449.18153: done generating all_blocks data 41684 1727204449.18154: processing included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 41684 1727204449.18155: loading included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 41684 1727204449.18158: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 41684 1727204449.18582: in VariableManager get_vars() 41684 1727204449.18952: done sending task result for task 0affcd87-79f5-3839-086d-00000000000c 41684 1727204449.18955: WORKER PROCESS EXITING 41684 1727204449.18985: done with get_vars() 41684 1727204449.19107: done processing included file 41684 1727204449.19108: iterating over new_blocks loaded from include file 41684 1727204449.19110: in VariableManager get_vars() 41684 1727204449.19126: done with get_vars() 41684 1727204449.19127: filtering new block on tags 41684 1727204449.19144: done filtering new block on tags 41684 1727204449.19146: done iterating over new_blocks loaded from include file included: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml for managed-node1 41684 1727204449.19151: extending task lists for all hosts with included blocks 41684 1727204449.19321: done extending task lists 41684 1727204449.19322: done processing included files 41684 1727204449.19323: results queue empty 41684 1727204449.19324: checking for any_errors_fatal 41684 1727204449.19327: done checking for any_errors_fatal 41684 1727204449.19328: checking for max_fail_percentage 41684 1727204449.19329: done checking for max_fail_percentage 41684 1727204449.19329: checking to see if all hosts have failed and the running result is not ok 41684 
1727204449.19330: done checking to see if all hosts have failed 41684 1727204449.19331: getting the remaining hosts for this loop 41684 1727204449.19332: done getting the remaining hosts for this loop 41684 1727204449.19334: getting the next task for host managed-node1 41684 1727204449.19338: done getting next task for host managed-node1 41684 1727204449.19340: ^ task is: TASK: Include the task 'get_current_interfaces.yml' 41684 1727204449.19342: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 41684 1727204449.19344: getting variables 41684 1727204449.19345: in VariableManager get_vars() 41684 1727204449.19358: Calling all_inventory to load vars for managed-node1 41684 1727204449.19360: Calling groups_inventory to load vars for managed-node1 41684 1727204449.19366: Calling all_plugins_inventory to load vars for managed-node1 41684 1727204449.19372: Calling all_plugins_play to load vars for managed-node1 41684 1727204449.19374: Calling groups_plugins_inventory to load vars for managed-node1 41684 1727204449.19377: Calling groups_plugins_play to load vars for managed-node1 41684 1727204449.19536: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41684 1727204449.19766: done with get_vars() 41684 1727204449.19775: done getting variables TASK [Include the task 'get_current_interfaces.yml'] *************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml:3 Tuesday 24 September 2024 15:00:49 -0400 (0:00:00.039) 0:00:05.600 ***** 41684 1727204449.19867: entering _queue_task() for managed-node1/include_tasks 41684 1727204449.20244: worker is 1 (out of 1 available) 41684 1727204449.20269: exiting _queue_task() for managed-node1/include_tasks 41684 1727204449.20282: done queuing things up, now waiting for results queue to drain 41684 1727204449.20283: waiting for pending results... 
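Both "Show interfaces" and the task it pulls in are dynamic includes: the executor returns an include result, the strategy loads the file, generates blocks, and extends the host's task list — which is why the host state above gains nested child states. A hedged sketch of the include chain (file paths are from the log; the exact task keywords in those files are assumed):

```yaml
# Sketch of the include chain walked in this log chunk.
# File paths match the logged task paths; bodies are assumed.

# tests_route_device.yml:15
- name: Show interfaces
  include_tasks: tasks/show_interfaces.yml

# tasks/show_interfaces.yml:3
- name: Include the task 'get_current_interfaces.yml'
  include_tasks: get_current_interfaces.yml
```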
41684 1727204449.20566: running TaskExecutor() for managed-node1/TASK: Include the task 'get_current_interfaces.yml' 41684 1727204449.20690: in run() - task 0affcd87-79f5-3839-086d-000000000135 41684 1727204449.20712: variable 'ansible_search_path' from source: unknown 41684 1727204449.20720: variable 'ansible_search_path' from source: unknown 41684 1727204449.20768: calling self._execute() 41684 1727204449.20858: variable 'ansible_host' from source: host vars for 'managed-node1' 41684 1727204449.20872: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41684 1727204449.20885: variable 'omit' from source: magic vars 41684 1727204449.21296: variable 'ansible_distribution_major_version' from source: facts 41684 1727204449.21313: Evaluated conditional (ansible_distribution_major_version != '6'): True 41684 1727204449.21323: _execute() done 41684 1727204449.21331: dumping result to json 41684 1727204449.21338: done dumping result, returning 41684 1727204449.21348: done running TaskExecutor() for managed-node1/TASK: Include the task 'get_current_interfaces.yml' [0affcd87-79f5-3839-086d-000000000135] 41684 1727204449.21369: sending task result for task 0affcd87-79f5-3839-086d-000000000135 41684 1727204449.21463: done sending task result for task 0affcd87-79f5-3839-086d-000000000135 41684 1727204449.21476: WORKER PROCESS EXITING 41684 1727204449.21510: no more pending results, returning what we have 41684 1727204449.21516: in VariableManager get_vars() 41684 1727204449.21573: Calling all_inventory to load vars for managed-node1 41684 1727204449.21578: Calling groups_inventory to load vars for managed-node1 41684 1727204449.21581: Calling all_plugins_inventory to load vars for managed-node1 41684 1727204449.21596: Calling all_plugins_play to load vars for managed-node1 41684 1727204449.21600: Calling groups_plugins_inventory to load vars for managed-node1 41684 1727204449.21604: Calling groups_plugins_play to load vars for managed-node1 41684 
1727204449.21874: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41684 1727204449.22125: done with get_vars() 41684 1727204449.22134: variable 'ansible_search_path' from source: unknown 41684 1727204449.22135: variable 'ansible_search_path' from source: unknown 41684 1727204449.22252: we have included files to process 41684 1727204449.22254: generating all_blocks data 41684 1727204449.22255: done generating all_blocks data 41684 1727204449.22256: processing included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 41684 1727204449.22257: loading included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 41684 1727204449.22260: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 41684 1727204449.22877: done processing included file 41684 1727204449.22879: iterating over new_blocks loaded from include file 41684 1727204449.22881: in VariableManager get_vars() 41684 1727204449.22901: done with get_vars() 41684 1727204449.22903: filtering new block on tags 41684 1727204449.22919: done filtering new block on tags 41684 1727204449.22922: done iterating over new_blocks loaded from include file included: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml for managed-node1 41684 1727204449.22928: extending task lists for all hosts with included blocks 41684 1727204449.23153: done extending task lists 41684 1727204449.23154: done processing included files 41684 1727204449.23155: results queue empty 41684 1727204449.23156: checking for any_errors_fatal 41684 1727204449.23160: done checking for any_errors_fatal 41684 1727204449.23160: checking for max_fail_percentage 41684 1727204449.23167: done 
checking for max_fail_percentage 41684 1727204449.23167: checking to see if all hosts have failed and the running result is not ok 41684 1727204449.23168: done checking to see if all hosts have failed 41684 1727204449.23169: getting the remaining hosts for this loop 41684 1727204449.23170: done getting the remaining hosts for this loop 41684 1727204449.23173: getting the next task for host managed-node1 41684 1727204449.23177: done getting next task for host managed-node1 41684 1727204449.23180: ^ task is: TASK: Gather current interface info 41684 1727204449.23182: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 41684 1727204449.23184: getting variables 41684 1727204449.23185: in VariableManager get_vars() 41684 1727204449.23316: Calling all_inventory to load vars for managed-node1 41684 1727204449.23319: Calling groups_inventory to load vars for managed-node1 41684 1727204449.23321: Calling all_plugins_inventory to load vars for managed-node1 41684 1727204449.23326: Calling all_plugins_play to load vars for managed-node1 41684 1727204449.23329: Calling groups_plugins_inventory to load vars for managed-node1 41684 1727204449.23331: Calling groups_plugins_play to load vars for managed-node1 41684 1727204449.23601: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41684 1727204449.24048: done with get_vars() 41684 1727204449.24058: done getting variables 41684 1727204449.24108: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Gather current interface info] ******************************************* task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:3 Tuesday 24 September 2024 15:00:49 -0400 (0:00:00.042) 0:00:05.642 ***** 41684 1727204449.24136: entering _queue_task() for managed-node1/command 41684 1727204449.24454: worker is 1 (out of 1 available) 41684 1727204449.24470: exiting _queue_task() for managed-node1/command 41684 1727204449.24483: done queuing things up, now waiting for results queue to drain 41684 1727204449.24485: waiting for pending results... 
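"Gather current interface info" is a `command` task (the log loads the `command` action plugin just before queuing it). The exact command line is not visible in this chunk; on Linux, a conventional way for a test helper to enumerate current interfaces — an assumption here, not a quote from get_current_interfaces.yml — is to list /sys/class/net:

```shell
# Assumed equivalent of the 'Gather current interface info' command task.
# The real command in get_current_interfaces.yml is not shown in this log chunk.
ls -1 /sys/class/net
```

Each entry under /sys/class/net is one network interface (e.g. `lo`); registering this output would give the test a baseline interface list to compare against later.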
41684 1727204449.24742: running TaskExecutor() for managed-node1/TASK: Gather current interface info 41684 1727204449.24858: in run() - task 0affcd87-79f5-3839-086d-00000000014e 41684 1727204449.24883: variable 'ansible_search_path' from source: unknown 41684 1727204449.24890: variable 'ansible_search_path' from source: unknown 41684 1727204449.24935: calling self._execute() 41684 1727204449.25027: variable 'ansible_host' from source: host vars for 'managed-node1' 41684 1727204449.25042: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41684 1727204449.25066: variable 'omit' from source: magic vars 41684 1727204449.25452: variable 'ansible_distribution_major_version' from source: facts 41684 1727204449.25484: Evaluated conditional (ansible_distribution_major_version != '6'): True 41684 1727204449.25498: variable 'omit' from source: magic vars 41684 1727204449.25546: variable 'omit' from source: magic vars 41684 1727204449.25598: variable 'omit' from source: magic vars 41684 1727204449.25646: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 41684 1727204449.25696: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 41684 1727204449.25730: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 41684 1727204449.25753: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41684 1727204449.25775: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41684 1727204449.25821: variable 'inventory_hostname' from source: host vars for 'managed-node1' 41684 1727204449.25835: variable 'ansible_host' from source: host vars for 'managed-node1' 41684 1727204449.25843: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41684 
1727204449.25956: Set connection var ansible_connection to ssh 41684 1727204449.25972: Set connection var ansible_pipelining to False 41684 1727204449.25982: Set connection var ansible_module_compression to ZIP_DEFLATED 41684 1727204449.25992: Set connection var ansible_timeout to 10 41684 1727204449.26002: Set connection var ansible_shell_executable to /bin/sh 41684 1727204449.26009: Set connection var ansible_shell_type to sh 41684 1727204449.26048: variable 'ansible_shell_executable' from source: unknown 41684 1727204449.26056: variable 'ansible_connection' from source: unknown 41684 1727204449.26068: variable 'ansible_module_compression' from source: unknown 41684 1727204449.26075: variable 'ansible_shell_type' from source: unknown 41684 1727204449.26082: variable 'ansible_shell_executable' from source: unknown 41684 1727204449.26087: variable 'ansible_host' from source: host vars for 'managed-node1' 41684 1727204449.26094: variable 'ansible_pipelining' from source: unknown 41684 1727204449.26099: variable 'ansible_timeout' from source: unknown 41684 1727204449.26106: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41684 1727204449.26276: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 41684 1727204449.26293: variable 'omit' from source: magic vars 41684 1727204449.26303: starting attempt loop 41684 1727204449.26309: running the handler 41684 1727204449.26328: _low_level_execute_command(): starting 41684 1727204449.26345: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 41684 1727204449.27171: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 41684 1727204449.27187: stderr chunk (state=3): >>>debug1: Reading configuration data 
/root/.ssh/config <<< 41684 1727204449.27201: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41684 1727204449.27221: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41684 1727204449.27283: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 41684 1727204449.27294: stderr chunk (state=3): >>>debug2: match not found <<< 41684 1727204449.27308: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204449.27325: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 41684 1727204449.27344: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 41684 1727204449.27357: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 41684 1727204449.27375: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41684 1727204449.27389: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41684 1727204449.27403: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41684 1727204449.27414: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 41684 1727204449.27424: stderr chunk (state=3): >>>debug2: match found <<< 41684 1727204449.27437: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204449.27526: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 41684 1727204449.27547: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41684 1727204449.27575: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41684 1727204449.27683: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session 
id: 4 <<< 41684 1727204449.29941: stdout chunk (state=3): >>>/root <<< 41684 1727204449.30188: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41684 1727204449.30319: stderr chunk (state=3): >>><<< 41684 1727204449.30322: stdout chunk (state=3): >>><<< 41684 1727204449.30371: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 41684 1727204449.30376: _low_level_execute_command(): starting 41684 1727204449.30379: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204449.3034365-42297-62429360963251 `" && echo ansible-tmp-1727204449.3034365-42297-62429360963251="` echo /root/.ansible/tmp/ansible-tmp-1727204449.3034365-42297-62429360963251 `" ) && sleep 0' 41684 1727204449.31659: stderr chunk (state=2): 
>>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 41684 1727204449.31668: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41684 1727204449.31681: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41684 1727204449.31706: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41684 1727204449.31746: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 41684 1727204449.31753: stderr chunk (state=3): >>>debug2: match not found <<< 41684 1727204449.31767: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204449.31778: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 41684 1727204449.31785: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 41684 1727204449.31792: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 41684 1727204449.31804: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41684 1727204449.31817: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41684 1727204449.31827: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41684 1727204449.31834: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 41684 1727204449.31840: stderr chunk (state=3): >>>debug2: match found <<< 41684 1727204449.31848: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204449.31929: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 41684 1727204449.31945: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41684 1727204449.31948: stderr chunk (state=3): >>>debug2: 
mux_client_hello_exchange: master version 4 <<< 41684 1727204449.32044: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 41684 1727204449.34215: stdout chunk (state=3): >>>ansible-tmp-1727204449.3034365-42297-62429360963251=/root/.ansible/tmp/ansible-tmp-1727204449.3034365-42297-62429360963251 <<< 41684 1727204449.34379: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41684 1727204449.34424: stderr chunk (state=3): >>><<< 41684 1727204449.34427: stdout chunk (state=3): >>><<< 41684 1727204449.34448: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204449.3034365-42297-62429360963251=/root/.ansible/tmp/ansible-tmp-1727204449.3034365-42297-62429360963251 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 41684 1727204449.34487: variable 'ansible_module_compression' from source: unknown 41684 1727204449.34562: ANSIBALLZ: 
Using generic lock for ansible.legacy.command 41684 1727204449.34568: ANSIBALLZ: Acquiring lock 41684 1727204449.34570: ANSIBALLZ: Lock acquired: 139842516808240 41684 1727204449.34572: ANSIBALLZ: Creating module 41684 1727204449.52204: ANSIBALLZ: Writing module into payload 41684 1727204449.52347: ANSIBALLZ: Writing module 41684 1727204449.52381: ANSIBALLZ: Renaming module 41684 1727204449.52394: ANSIBALLZ: Done creating module 41684 1727204449.52422: variable 'ansible_facts' from source: unknown 41684 1727204449.52516: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204449.3034365-42297-62429360963251/AnsiballZ_command.py 41684 1727204449.53104: Sending initial data 41684 1727204449.53108: Sent initial data (155 bytes) 41684 1727204449.54683: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41684 1727204449.54721: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204449.54726: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 41684 1727204449.54744: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found <<< 41684 1727204449.54748: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204449.54825: stderr chunk (state=3): >>>debug1: auto-mux: 
Trying existing master <<< 41684 1727204449.54828: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41684 1727204449.54843: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41684 1727204449.54925: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41684 1727204449.57229: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 41684 1727204449.57297: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 41684 1727204449.57352: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-41684fyviudxd/tmp63gosduo /root/.ansible/tmp/ansible-tmp-1727204449.3034365-42297-62429360963251/AnsiballZ_command.py <<< 41684 1727204449.57406: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 41684 1727204449.59288: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41684 1727204449.59473: stderr chunk (state=3): >>><<< 41684 1727204449.59478: stdout chunk (state=3): >>><<< 41684 1727204449.59480: done transferring module to remote 41684 1727204449.59574: _low_level_execute_command(): starting 41684 1727204449.59579: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204449.3034365-42297-62429360963251/ 
/root/.ansible/tmp/ansible-tmp-1727204449.3034365-42297-62429360963251/AnsiballZ_command.py && sleep 0' 41684 1727204449.60524: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 41684 1727204449.60528: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41684 1727204449.60548: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration <<< 41684 1727204449.60551: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204449.60635: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 41684 1727204449.60649: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41684 1727204449.60747: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 41684 1727204449.63245: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41684 1727204449.63249: stdout chunk (state=3): >>><<< 41684 1727204449.63251: stderr chunk (state=3): >>><<< 41684 1727204449.63347: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 41684 1727204449.63350: _low_level_execute_command(): starting 41684 1727204449.63353: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204449.3034365-42297-62429360963251/AnsiballZ_command.py && sleep 0' 41684 1727204449.64509: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 41684 1727204449.64513: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41684 1727204449.64547: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204449.64550: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address 
debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 41684 1727204449.64552: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204449.64630: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 41684 1727204449.64634: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41684 1727204449.64718: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 41684 1727204449.85395: stdout chunk (state=3): >>> {"changed": true, "stdout": "bonding_masters\neth0\nlo\nrpltstbr", "stderr": "", "rc": 0, "cmd": ["ls", "-1"], "start": "2024-09-24 15:00:49.848582", "end": "2024-09-24 15:00:49.853059", "delta": "0:00:00.004477", "msg": "", "invocation": {"module_args": {"chdir": "/sys/class/net", "_raw_params": "ls -1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 41684 1727204449.86896: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.148 closed. 
<<< 41684 1727204449.86936: stderr chunk (state=3): >>><<< 41684 1727204449.86939: stdout chunk (state=3): >>><<< 41684 1727204449.87094: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "bonding_masters\neth0\nlo\nrpltstbr", "stderr": "", "rc": 0, "cmd": ["ls", "-1"], "start": "2024-09-24 15:00:49.848582", "end": "2024-09-24 15:00:49.853059", "delta": "0:00:00.004477", "msg": "", "invocation": {"module_args": {"chdir": "/sys/class/net", "_raw_params": "ls -1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 Shared connection to 10.31.9.148 closed. 
41684 1727204449.87098: done with _execute_module (ansible.legacy.command, {'chdir': '/sys/class/net', '_raw_params': 'ls -1', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204449.3034365-42297-62429360963251/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 41684 1727204449.87101: _low_level_execute_command(): starting 41684 1727204449.87107: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204449.3034365-42297-62429360963251/ > /dev/null 2>&1 && sleep 0' 41684 1727204449.87716: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 41684 1727204449.87732: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41684 1727204449.87754: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41684 1727204449.87785: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41684 1727204449.87826: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 41684 1727204449.87837: stderr chunk (state=3): >>>debug2: match not found <<< 41684 1727204449.87850: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204449.87906: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 41684 1727204449.87921: stderr chunk 
(state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 41684 1727204449.87956: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 41684 1727204449.87983: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41684 1727204449.88000: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41684 1727204449.88017: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41684 1727204449.88030: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 41684 1727204449.88043: stderr chunk (state=3): >>>debug2: match found <<< 41684 1727204449.88058: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204449.88155: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 41684 1727204449.88182: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41684 1727204449.88209: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41684 1727204449.88292: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 41684 1727204449.91186: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41684 1727204449.91293: stderr chunk (state=3): >>><<< 41684 1727204449.91299: stdout chunk (state=3): >>><<< 41684 1727204449.91371: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match 
pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 41684 1727204449.91374: handler run complete 41684 1727204449.91376: Evaluated conditional (False): False 41684 1727204449.91378: attempt loop complete, returning result 41684 1727204449.91380: _execute() done 41684 1727204449.91382: dumping result to json 41684 1727204449.91682: done dumping result, returning 41684 1727204449.91685: done running TaskExecutor() for managed-node1/TASK: Gather current interface info [0affcd87-79f5-3839-086d-00000000014e] 41684 1727204449.91688: sending task result for task 0affcd87-79f5-3839-086d-00000000014e 41684 1727204449.91758: done sending task result for task 0affcd87-79f5-3839-086d-00000000014e 41684 1727204449.91761: WORKER PROCESS EXITING ok: [managed-node1] => { "changed": false, "cmd": [ "ls", "-1" ], "delta": "0:00:00.004477", "end": "2024-09-24 15:00:49.853059", "rc": 0, "start": "2024-09-24 15:00:49.848582" } STDOUT: bonding_masters eth0 lo rpltstbr 41684 1727204449.91882: no more pending results, returning what we have 41684 1727204449.91886: results queue empty 41684 1727204449.91887: checking for any_errors_fatal 41684 1727204449.91889: done checking for any_errors_fatal 41684 1727204449.91890: checking for max_fail_percentage 41684 1727204449.91892: done checking for max_fail_percentage 41684 1727204449.91892: checking to see if all hosts have 
failed and the running result is not ok 41684 1727204449.91893: done checking to see if all hosts have failed 41684 1727204449.91894: getting the remaining hosts for this loop 41684 1727204449.91896: done getting the remaining hosts for this loop 41684 1727204449.91900: getting the next task for host managed-node1 41684 1727204449.91907: done getting next task for host managed-node1 41684 1727204449.91910: ^ task is: TASK: Set current_interfaces 41684 1727204449.91915: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 41684 1727204449.91918: getting variables 41684 1727204449.91919: in VariableManager get_vars() 41684 1727204449.91968: Calling all_inventory to load vars for managed-node1 41684 1727204449.91971: Calling groups_inventory to load vars for managed-node1 41684 1727204449.91974: Calling all_plugins_inventory to load vars for managed-node1 41684 1727204449.91985: Calling all_plugins_play to load vars for managed-node1 41684 1727204449.91987: Calling groups_plugins_inventory to load vars for managed-node1 41684 1727204449.91990: Calling groups_plugins_play to load vars for managed-node1 41684 1727204449.92185: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41684 1727204449.92416: done with get_vars() 41684 1727204449.92429: done getting variables 41684 1727204449.92646: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set current_interfaces] ************************************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:9 Tuesday 24 September 2024 15:00:49 -0400 (0:00:00.685) 0:00:06.328 ***** 41684 1727204449.92681: entering _queue_task() for managed-node1/set_fact 41684 1727204449.92977: worker is 1 (out of 1 available) 41684 1727204449.92990: exiting _queue_task() for managed-node1/set_fact 41684 1727204449.93004: done queuing things up, now waiting for results queue to drain 41684 1727204449.93005: waiting for pending results... 
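The "Gather current interface info" task whose result appears above just runs `ls -1` with `chdir: /sys/class/net`; every directory entry under `/sys/class/net` is a kernel network interface name, plus the special `bonding_masters` control file when the bonding module is loaded. A minimal Python sketch of the same enumeration (hypothetical helper name, not part of the test suite being logged):

```python
import os

def list_net_interfaces(sysfs_dir="/sys/class/net"):
    """Enumerate network interfaces the way `ls -1 /sys/class/net` does.

    Note: on a host with the bonding module loaded, the listing also
    contains the control file 'bonding_masters', which is not itself
    an interface; `ls -1` makes no such distinction and neither do we.
    """
    # ls sorts its output; sort here to match.
    return sorted(os.listdir(sysfs_dir))
```

On the managed node in this run, the equivalent listing was `['bonding_masters', 'eth0', 'lo', 'rpltstbr']`.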
41684 1727204449.93285: running TaskExecutor() for managed-node1/TASK: Set current_interfaces 41684 1727204449.93395: in run() - task 0affcd87-79f5-3839-086d-00000000014f 41684 1727204449.93415: variable 'ansible_search_path' from source: unknown 41684 1727204449.93423: variable 'ansible_search_path' from source: unknown 41684 1727204449.93471: calling self._execute() 41684 1727204449.93555: variable 'ansible_host' from source: host vars for 'managed-node1' 41684 1727204449.93572: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41684 1727204449.93586: variable 'omit' from source: magic vars 41684 1727204449.93943: variable 'ansible_distribution_major_version' from source: facts 41684 1727204449.93960: Evaluated conditional (ansible_distribution_major_version != '6'): True 41684 1727204449.93975: variable 'omit' from source: magic vars 41684 1727204449.94026: variable 'omit' from source: magic vars 41684 1727204449.94140: variable '_current_interfaces' from source: set_fact 41684 1727204449.94209: variable 'omit' from source: magic vars 41684 1727204449.94326: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 41684 1727204449.94372: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 41684 1727204449.94397: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 41684 1727204449.94424: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41684 1727204449.94443: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41684 1727204449.94478: variable 'inventory_hostname' from source: host vars for 'managed-node1' 41684 1727204449.94487: variable 'ansible_host' from source: host vars for 'managed-node1' 41684 1727204449.94493: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41684 1727204449.94595: Set connection var ansible_connection to ssh 41684 1727204449.94606: Set connection var ansible_pipelining to False 41684 1727204449.94615: Set connection var ansible_module_compression to ZIP_DEFLATED 41684 1727204449.94624: Set connection var ansible_timeout to 10 41684 1727204449.94633: Set connection var ansible_shell_executable to /bin/sh 41684 1727204449.94639: Set connection var ansible_shell_type to sh 41684 1727204449.94676: variable 'ansible_shell_executable' from source: unknown 41684 1727204449.94684: variable 'ansible_connection' from source: unknown 41684 1727204449.94690: variable 'ansible_module_compression' from source: unknown 41684 1727204449.94696: variable 'ansible_shell_type' from source: unknown 41684 1727204449.94701: variable 'ansible_shell_executable' from source: unknown 41684 1727204449.94707: variable 'ansible_host' from source: host vars for 'managed-node1' 41684 1727204449.94713: variable 'ansible_pipelining' from source: unknown 41684 1727204449.94718: variable 'ansible_timeout' from source: unknown 41684 1727204449.94724: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41684 1727204449.94857: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 41684 1727204449.94880: variable 'omit' from source: magic vars 41684 1727204449.94889: starting attempt loop 41684 1727204449.94894: running the handler 41684 1727204449.94908: handler run complete 41684 1727204449.94920: attempt loop complete, returning result 41684 1727204449.94926: _execute() done 41684 1727204449.94932: dumping result to json 41684 1727204449.94938: done dumping result, returning 41684 
1727204449.94948: done running TaskExecutor() for managed-node1/TASK: Set current_interfaces [0affcd87-79f5-3839-086d-00000000014f] 41684 1727204449.94956: sending task result for task 0affcd87-79f5-3839-086d-00000000014f 41684 1727204449.95057: done sending task result for task 0affcd87-79f5-3839-086d-00000000014f 41684 1727204449.95069: WORKER PROCESS EXITING ok: [managed-node1] => { "ansible_facts": { "current_interfaces": [ "bonding_masters", "eth0", "lo", "rpltstbr" ] }, "changed": false } 41684 1727204449.95139: no more pending results, returning what we have 41684 1727204449.95145: results queue empty 41684 1727204449.95146: checking for any_errors_fatal 41684 1727204449.95155: done checking for any_errors_fatal 41684 1727204449.95156: checking for max_fail_percentage 41684 1727204449.95157: done checking for max_fail_percentage 41684 1727204449.95158: checking to see if all hosts have failed and the running result is not ok 41684 1727204449.95158: done checking to see if all hosts have failed 41684 1727204449.95159: getting the remaining hosts for this loop 41684 1727204449.95161: done getting the remaining hosts for this loop 41684 1727204449.95170: getting the next task for host managed-node1 41684 1727204449.95179: done getting next task for host managed-node1 41684 1727204449.95182: ^ task is: TASK: Show current_interfaces 41684 1727204449.95185: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 41684 1727204449.95189: getting variables 41684 1727204449.95190: in VariableManager get_vars() 41684 1727204449.95232: Calling all_inventory to load vars for managed-node1 41684 1727204449.95235: Calling groups_inventory to load vars for managed-node1 41684 1727204449.95237: Calling all_plugins_inventory to load vars for managed-node1 41684 1727204449.95250: Calling all_plugins_play to load vars for managed-node1 41684 1727204449.95253: Calling groups_plugins_inventory to load vars for managed-node1 41684 1727204449.95255: Calling groups_plugins_play to load vars for managed-node1 41684 1727204449.95494: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41684 1727204449.95723: done with get_vars() 41684 1727204449.95735: done getting variables 41684 1727204449.95834: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [Show current_interfaces] ************************************************* task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml:5 Tuesday 24 September 2024 15:00:49 -0400 (0:00:00.031) 0:00:06.360 ***** 41684 1727204449.95870: entering _queue_task() for managed-node1/debug 41684 1727204449.95872: Creating lock for debug 41684 1727204449.96378: worker is 1 (out of 1 available) 41684 1727204449.96390: exiting _queue_task() for managed-node1/debug 41684 1727204449.96402: done queuing things up, now waiting for results queue to drain 41684 1727204449.96404: waiting for pending results... 
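The "Set current_interfaces" task above never touches the remote host: its `set_fact` handler runs locally ("running the handler" followed immediately by "handler run complete") and templates the previous command's `stdout_lines` into the `current_interfaces` fact. A hedged sketch of that stdout-to-list step (the splitting behavior is assumed to mirror Ansible's `stdout_lines` with `strip_empty_ends=true`, as shown in the module invocation logged earlier):

```python
def stdout_lines(stdout: str) -> list[str]:
    # Ansible exposes command stdout both raw and as `stdout_lines`:
    # the stdout split on newlines, without an empty trailing element
    # from the final newline.
    return stdout.rstrip("\n").splitlines()

# Reproduces the fact recorded in the log for managed-node1.
current_interfaces = stdout_lines("bonding_masters\neth0\nlo\nrpltstbr")
```

This is why the task reports `"changed": false` yet still yields `ansible_facts.current_interfaces` for the following "Show current_interfaces" debug task.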
41684 1727204449.96651: running TaskExecutor() for managed-node1/TASK: Show current_interfaces 41684 1727204449.96758: in run() - task 0affcd87-79f5-3839-086d-000000000136 41684 1727204449.96782: variable 'ansible_search_path' from source: unknown 41684 1727204449.96790: variable 'ansible_search_path' from source: unknown 41684 1727204449.96828: calling self._execute() 41684 1727204449.96916: variable 'ansible_host' from source: host vars for 'managed-node1' 41684 1727204449.96926: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41684 1727204449.96938: variable 'omit' from source: magic vars 41684 1727204449.97303: variable 'ansible_distribution_major_version' from source: facts 41684 1727204449.97320: Evaluated conditional (ansible_distribution_major_version != '6'): True 41684 1727204449.97330: variable 'omit' from source: magic vars 41684 1727204449.97372: variable 'omit' from source: magic vars 41684 1727204449.97479: variable 'current_interfaces' from source: set_fact 41684 1727204449.97516: variable 'omit' from source: magic vars 41684 1727204449.97565: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 41684 1727204449.97609: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 41684 1727204449.97635: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 41684 1727204449.97659: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41684 1727204449.97679: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41684 1727204449.97716: variable 'inventory_hostname' from source: host vars for 'managed-node1' 41684 1727204449.97725: variable 'ansible_host' from source: host vars for 'managed-node1' 41684 1727204449.97732: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41684 1727204449.97838: Set connection var ansible_connection to ssh 41684 1727204449.97850: Set connection var ansible_pipelining to False 41684 1727204449.97859: Set connection var ansible_module_compression to ZIP_DEFLATED 41684 1727204449.97874: Set connection var ansible_timeout to 10 41684 1727204449.97886: Set connection var ansible_shell_executable to /bin/sh 41684 1727204449.97893: Set connection var ansible_shell_type to sh 41684 1727204449.97920: variable 'ansible_shell_executable' from source: unknown 41684 1727204449.97932: variable 'ansible_connection' from source: unknown 41684 1727204449.97939: variable 'ansible_module_compression' from source: unknown 41684 1727204449.97944: variable 'ansible_shell_type' from source: unknown 41684 1727204449.97950: variable 'ansible_shell_executable' from source: unknown 41684 1727204449.97956: variable 'ansible_host' from source: host vars for 'managed-node1' 41684 1727204449.97967: variable 'ansible_pipelining' from source: unknown 41684 1727204449.97974: variable 'ansible_timeout' from source: unknown 41684 1727204449.97982: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41684 1727204449.98122: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 41684 1727204449.98137: variable 'omit' from source: magic vars 41684 1727204449.98150: starting attempt loop 41684 1727204449.98156: running the handler 41684 1727204449.98200: handler run complete 41684 1727204449.98215: attempt loop complete, returning result 41684 1727204449.98220: _execute() done 41684 1727204449.98226: dumping result to json 41684 1727204449.98231: done dumping result, returning 41684 1727204449.98239: done 
running TaskExecutor() for managed-node1/TASK: Show current_interfaces [0affcd87-79f5-3839-086d-000000000136] 41684 1727204449.98246: sending task result for task 0affcd87-79f5-3839-086d-000000000136 41684 1727204449.98341: done sending task result for task 0affcd87-79f5-3839-086d-000000000136 41684 1727204449.98346: WORKER PROCESS EXITING ok: [managed-node1] => {} MSG: current_interfaces: ['bonding_masters', 'eth0', 'lo', 'rpltstbr'] 41684 1727204449.98416: no more pending results, returning what we have 41684 1727204449.98421: results queue empty 41684 1727204449.98422: checking for any_errors_fatal 41684 1727204449.98429: done checking for any_errors_fatal 41684 1727204449.98430: checking for max_fail_percentage 41684 1727204449.98431: done checking for max_fail_percentage 41684 1727204449.98432: checking to see if all hosts have failed and the running result is not ok 41684 1727204449.98433: done checking to see if all hosts have failed 41684 1727204449.98434: getting the remaining hosts for this loop 41684 1727204449.98436: done getting the remaining hosts for this loop 41684 1727204449.98442: getting the next task for host managed-node1 41684 1727204449.98451: done getting next task for host managed-node1 41684 1727204449.98454: ^ task is: TASK: Manage test interface 41684 1727204449.98456: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 41684 1727204449.98459: getting variables 41684 1727204449.98461: in VariableManager get_vars() 41684 1727204449.98506: Calling all_inventory to load vars for managed-node1 41684 1727204449.98510: Calling groups_inventory to load vars for managed-node1 41684 1727204449.98512: Calling all_plugins_inventory to load vars for managed-node1 41684 1727204449.98523: Calling all_plugins_play to load vars for managed-node1 41684 1727204449.98526: Calling groups_plugins_inventory to load vars for managed-node1 41684 1727204449.98528: Calling groups_plugins_play to load vars for managed-node1 41684 1727204449.98713: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41684 1727204449.98916: done with get_vars() 41684 1727204449.98927: done getting variables TASK [Manage test interface] *************************************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_route_device.yml:17 Tuesday 24 September 2024 15:00:49 -0400 (0:00:00.031) 0:00:06.391 ***** 41684 1727204449.99024: entering _queue_task() for managed-node1/include_tasks 41684 1727204449.99514: worker is 1 (out of 1 available) 41684 1727204449.99526: exiting _queue_task() for managed-node1/include_tasks 41684 1727204449.99538: done queuing things up, now waiting for results queue to drain 41684 1727204449.99540: waiting for pending results... 
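The `Manage test interface` task at `tests_route_device.yml:17` is an `include_tasks` (the log enters `_queue_task()` for `managed-node1/include_tasks`, then loads `manage_test_interface.yml` and extends the task lists with the included blocks). A sketch of the calling task, assuming a relative `tasks/` path consistent with the file path the log loads; any vars passed alongside the include are not visible in this log:

```yaml
# Hedged sketch of tests_route_device.yml:17; the include target matches the
# file the log loads next. Any vars (e.g. state/type) are assumptions.
- name: Manage test interface
  include_tasks: tasks/manage_test_interface.yml
```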
41684 1727204450.00023: running TaskExecutor() for managed-node1/TASK: Manage test interface 41684 1727204450.00160: in run() - task 0affcd87-79f5-3839-086d-00000000000d 41684 1727204450.00287: variable 'ansible_search_path' from source: unknown 41684 1727204450.00345: calling self._execute() 41684 1727204450.00501: variable 'ansible_host' from source: host vars for 'managed-node1' 41684 1727204450.00617: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41684 1727204450.00638: variable 'omit' from source: magic vars 41684 1727204450.01537: variable 'ansible_distribution_major_version' from source: facts 41684 1727204450.01631: Evaluated conditional (ansible_distribution_major_version != '6'): True 41684 1727204450.01643: _execute() done 41684 1727204450.01651: dumping result to json 41684 1727204450.01656: done dumping result, returning 41684 1727204450.01670: done running TaskExecutor() for managed-node1/TASK: Manage test interface [0affcd87-79f5-3839-086d-00000000000d] 41684 1727204450.01680: sending task result for task 0affcd87-79f5-3839-086d-00000000000d 41684 1727204450.01809: no more pending results, returning what we have 41684 1727204450.01814: in VariableManager get_vars() 41684 1727204450.01868: Calling all_inventory to load vars for managed-node1 41684 1727204450.01871: Calling groups_inventory to load vars for managed-node1 41684 1727204450.01874: Calling all_plugins_inventory to load vars for managed-node1 41684 1727204450.01889: Calling all_plugins_play to load vars for managed-node1 41684 1727204450.01891: Calling groups_plugins_inventory to load vars for managed-node1 41684 1727204450.01895: Calling groups_plugins_play to load vars for managed-node1 41684 1727204450.02145: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41684 1727204450.02369: done with get_vars() 41684 1727204450.02378: variable 'ansible_search_path' from source: unknown 41684 1727204450.02394: 
we have included files to process 41684 1727204450.02395: generating all_blocks data 41684 1727204450.02398: done generating all_blocks data 41684 1727204450.02403: processing included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml 41684 1727204450.02404: loading included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml 41684 1727204450.02407: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml 41684 1727204450.03586: done sending task result for task 0affcd87-79f5-3839-086d-00000000000d 41684 1727204450.03590: WORKER PROCESS EXITING 41684 1727204450.03868: in VariableManager get_vars() 41684 1727204450.03892: done with get_vars() 41684 1727204450.04887: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup 41684 1727204450.06335: done processing included file 41684 1727204450.06337: iterating over new_blocks loaded from include file 41684 1727204450.06339: in VariableManager get_vars() 41684 1727204450.06358: done with get_vars() 41684 1727204450.06360: filtering new block on tags 41684 1727204450.06393: done filtering new block on tags 41684 1727204450.06395: done iterating over new_blocks loaded from include file included: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml for managed-node1 41684 1727204450.06401: extending task lists for all hosts with included blocks 41684 1727204450.06606: done extending task lists 41684 1727204450.06608: done processing included files 41684 1727204450.06609: results queue empty 41684 1727204450.06610: checking for any_errors_fatal 41684 1727204450.06613: done checking for any_errors_fatal 41684 1727204450.06614: checking for max_fail_percentage 41684 1727204450.06615: done checking for 
max_fail_percentage 41684 1727204450.06615: checking to see if all hosts have failed and the running result is not ok 41684 1727204450.06616: done checking to see if all hosts have failed 41684 1727204450.06617: getting the remaining hosts for this loop 41684 1727204450.06618: done getting the remaining hosts for this loop 41684 1727204450.06620: getting the next task for host managed-node1 41684 1727204450.06623: done getting next task for host managed-node1 41684 1727204450.06626: ^ task is: TASK: Ensure state in ["present", "absent"] 41684 1727204450.06628: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 41684 1727204450.06630: getting variables 41684 1727204450.06631: in VariableManager get_vars() 41684 1727204450.06644: Calling all_inventory to load vars for managed-node1 41684 1727204450.06647: Calling groups_inventory to load vars for managed-node1 41684 1727204450.06648: Calling all_plugins_inventory to load vars for managed-node1 41684 1727204450.06653: Calling all_plugins_play to load vars for managed-node1 41684 1727204450.06655: Calling groups_plugins_inventory to load vars for managed-node1 41684 1727204450.06658: Calling groups_plugins_play to load vars for managed-node1 41684 1727204450.07552: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41684 1727204450.07799: done with get_vars() 41684 1727204450.07810: done getting variables 41684 1727204450.07885: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [Ensure state in ["present", "absent"]] *********************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:3 Tuesday 24 September 2024 15:00:50 -0400 (0:00:00.088) 0:00:06.480 ***** 41684 1727204450.07914: entering _queue_task() for managed-node1/fail 41684 1727204450.07916: Creating lock for fail 41684 1727204450.08799: worker is 1 (out of 1 available) 41684 1727204450.08810: exiting _queue_task() for managed-node1/fail 41684 1727204450.08828: done queuing things up, now waiting for results queue to drain 41684 1727204450.08830: waiting for pending results... 
41684 1727204450.09099: running TaskExecutor() for managed-node1/TASK: Ensure state in ["present", "absent"] 41684 1727204450.09221: in run() - task 0affcd87-79f5-3839-086d-00000000016a 41684 1727204450.09233: variable 'ansible_search_path' from source: unknown 41684 1727204450.09236: variable 'ansible_search_path' from source: unknown 41684 1727204450.09290: calling self._execute() 41684 1727204450.09387: variable 'ansible_host' from source: host vars for 'managed-node1' 41684 1727204450.09399: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41684 1727204450.09414: variable 'omit' from source: magic vars 41684 1727204450.09923: variable 'ansible_distribution_major_version' from source: facts 41684 1727204450.09948: Evaluated conditional (ansible_distribution_major_version != '6'): True 41684 1727204450.10185: variable 'state' from source: include params 41684 1727204450.10222: Evaluated conditional (state not in ["present", "absent"]): False 41684 1727204450.10256: when evaluation is False, skipping this task 41684 1727204450.10271: _execute() done 41684 1727204450.10284: dumping result to json 41684 1727204450.10291: done dumping result, returning 41684 1727204450.10301: done running TaskExecutor() for managed-node1/TASK: Ensure state in ["present", "absent"] [0affcd87-79f5-3839-086d-00000000016a] 41684 1727204450.10311: sending task result for task 0affcd87-79f5-3839-086d-00000000016a skipping: [managed-node1] => { "changed": false, "false_condition": "state not in [\"present\", \"absent\"]", "skip_reason": "Conditional result was False" } 41684 1727204450.10458: no more pending results, returning what we have 41684 1727204450.10463: results queue empty 41684 1727204450.10466: checking for any_errors_fatal 41684 1727204450.10468: done checking for any_errors_fatal 41684 1727204450.10468: checking for max_fail_percentage 41684 1727204450.10470: done checking for max_fail_percentage 41684 1727204450.10471: checking to see if all hosts 
have failed and the running result is not ok 41684 1727204450.10472: done checking to see if all hosts have failed 41684 1727204450.10472: getting the remaining hosts for this loop 41684 1727204450.10474: done getting the remaining hosts for this loop 41684 1727204450.10479: getting the next task for host managed-node1 41684 1727204450.10485: done getting next task for host managed-node1 41684 1727204450.10488: ^ task is: TASK: Ensure type in ["dummy", "tap", "veth"] 41684 1727204450.10491: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 41684 1727204450.10495: getting variables 41684 1727204450.10497: in VariableManager get_vars() 41684 1727204450.10544: Calling all_inventory to load vars for managed-node1 41684 1727204450.10547: Calling groups_inventory to load vars for managed-node1 41684 1727204450.10550: Calling all_plugins_inventory to load vars for managed-node1 41684 1727204450.10571: Calling all_plugins_play to load vars for managed-node1 41684 1727204450.10575: Calling groups_plugins_inventory to load vars for managed-node1 41684 1727204450.10579: Calling groups_plugins_play to load vars for managed-node1 41684 1727204450.10797: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41684 1727204450.11022: done with get_vars() 41684 1727204450.11034: done getting variables 41684 1727204450.11225: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 41684 1727204450.11251: done sending task result for task 0affcd87-79f5-3839-086d-00000000016a 41684 1727204450.11254: WORKER PROCESS EXITING TASK [Ensure type in ["dummy", "tap", "veth"]] ********************************* task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:8 Tuesday 24 September 2024 15:00:50 -0400 (0:00:00.033) 0:00:06.514 ***** 41684 1727204450.11271: entering _queue_task() for managed-node1/fail 41684 1727204450.11824: worker is 1 (out of 1 available) 41684 1727204450.11837: exiting _queue_task() for managed-node1/fail 41684 1727204450.11849: done queuing things up, now waiting for results queue to drain 41684 1727204450.11851: waiting for pending results... 
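The two `Ensure ...` tasks at `manage_test_interface.yml:3` and `:8` are guard-style `fail` tasks; when they are skipped, the log prints their conditions verbatim in the `false_condition` fields. A sketch of the pattern — the `when` expressions are taken directly from the skip output in this log, while the `msg` texts are placeholder assumptions:

```yaml
# Guard tasks reconstructed from the false_condition strings in this log.
# Only the task names and conditions are confirmed; messages are assumed.
- name: Ensure state in ["present", "absent"]
  fail:
    msg: "state must be 'present' or 'absent'"  # assumed wording
  when: state not in ["present", "absent"]

- name: Ensure type in ["dummy", "tap", "veth"]
  fail:
    msg: "type must be 'dummy', 'tap', or 'veth'"  # assumed wording
  when: type not in ["dummy", "tap", "veth"]
```

Because both conditions evaluate False here (`state` and `type` are valid), each task is skipped with "Conditional result was False" rather than failing the play.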
41684 1727204450.12160: running TaskExecutor() for managed-node1/TASK: Ensure type in ["dummy", "tap", "veth"] 41684 1727204450.12324: in run() - task 0affcd87-79f5-3839-086d-00000000016b 41684 1727204450.12367: variable 'ansible_search_path' from source: unknown 41684 1727204450.12376: variable 'ansible_search_path' from source: unknown 41684 1727204450.12417: calling self._execute() 41684 1727204450.12566: variable 'ansible_host' from source: host vars for 'managed-node1' 41684 1727204450.12579: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41684 1727204450.12594: variable 'omit' from source: magic vars 41684 1727204450.13086: variable 'ansible_distribution_major_version' from source: facts 41684 1727204450.13138: Evaluated conditional (ansible_distribution_major_version != '6'): True 41684 1727204450.13307: variable 'type' from source: set_fact 41684 1727204450.13397: Evaluated conditional (type not in ["dummy", "tap", "veth"]): False 41684 1727204450.13405: when evaluation is False, skipping this task 41684 1727204450.13412: _execute() done 41684 1727204450.13419: dumping result to json 41684 1727204450.13425: done dumping result, returning 41684 1727204450.13439: done running TaskExecutor() for managed-node1/TASK: Ensure type in ["dummy", "tap", "veth"] [0affcd87-79f5-3839-086d-00000000016b] 41684 1727204450.13450: sending task result for task 0affcd87-79f5-3839-086d-00000000016b skipping: [managed-node1] => { "changed": false, "false_condition": "type not in [\"dummy\", \"tap\", \"veth\"]", "skip_reason": "Conditional result was False" } 41684 1727204450.13594: no more pending results, returning what we have 41684 1727204450.13598: results queue empty 41684 1727204450.13600: checking for any_errors_fatal 41684 1727204450.13606: done checking for any_errors_fatal 41684 1727204450.13607: checking for max_fail_percentage 41684 1727204450.13608: done checking for max_fail_percentage 41684 1727204450.13609: checking to see if all 
hosts have failed and the running result is not ok 41684 1727204450.13610: done checking to see if all hosts have failed 41684 1727204450.13611: getting the remaining hosts for this loop 41684 1727204450.13613: done getting the remaining hosts for this loop 41684 1727204450.13617: getting the next task for host managed-node1 41684 1727204450.13624: done getting next task for host managed-node1 41684 1727204450.13628: ^ task is: TASK: Include the task 'show_interfaces.yml' 41684 1727204450.13631: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 41684 1727204450.13634: getting variables 41684 1727204450.13636: in VariableManager get_vars() 41684 1727204450.13684: Calling all_inventory to load vars for managed-node1 41684 1727204450.13687: Calling groups_inventory to load vars for managed-node1 41684 1727204450.13690: Calling all_plugins_inventory to load vars for managed-node1 41684 1727204450.13705: Calling all_plugins_play to load vars for managed-node1 41684 1727204450.13708: Calling groups_plugins_inventory to load vars for managed-node1 41684 1727204450.13711: Calling groups_plugins_play to load vars for managed-node1 41684 1727204450.13978: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41684 1727204450.14318: done with get_vars() 41684 1727204450.14329: done getting variables 41684 1727204450.14376: done sending task result for task 0affcd87-79f5-3839-086d-00000000016b 41684 1727204450.14379: WORKER PROCESS EXITING TASK [Include the task 'show_interfaces.yml'] ********************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:13 Tuesday 24 September 2024 15:00:50 -0400 (0:00:00.032) 0:00:06.547 ***** 41684 1727204450.14570: entering _queue_task() for managed-node1/include_tasks 41684 1727204450.14935: worker is 1 (out of 1 available) 41684 1727204450.14948: exiting _queue_task() for managed-node1/include_tasks 41684 1727204450.15051: done queuing things up, now waiting for results queue to drain 41684 1727204450.15053: waiting for pending results... 
41684 1727204450.15570: running TaskExecutor() for managed-node1/TASK: Include the task 'show_interfaces.yml' 41684 1727204450.15821: in run() - task 0affcd87-79f5-3839-086d-00000000016c 41684 1727204450.15958: variable 'ansible_search_path' from source: unknown 41684 1727204450.15969: variable 'ansible_search_path' from source: unknown 41684 1727204450.16012: calling self._execute() 41684 1727204450.16200: variable 'ansible_host' from source: host vars for 'managed-node1' 41684 1727204450.16212: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41684 1727204450.16227: variable 'omit' from source: magic vars 41684 1727204450.17268: variable 'ansible_distribution_major_version' from source: facts 41684 1727204450.17290: Evaluated conditional (ansible_distribution_major_version != '6'): True 41684 1727204450.17305: _execute() done 41684 1727204450.17368: dumping result to json 41684 1727204450.17377: done dumping result, returning 41684 1727204450.17388: done running TaskExecutor() for managed-node1/TASK: Include the task 'show_interfaces.yml' [0affcd87-79f5-3839-086d-00000000016c] 41684 1727204450.17402: sending task result for task 0affcd87-79f5-3839-086d-00000000016c 41684 1727204450.17540: no more pending results, returning what we have 41684 1727204450.17546: in VariableManager get_vars() 41684 1727204450.17606: Calling all_inventory to load vars for managed-node1 41684 1727204450.17609: Calling groups_inventory to load vars for managed-node1 41684 1727204450.17612: Calling all_plugins_inventory to load vars for managed-node1 41684 1727204450.17626: Calling all_plugins_play to load vars for managed-node1 41684 1727204450.17629: Calling groups_plugins_inventory to load vars for managed-node1 41684 1727204450.17633: Calling groups_plugins_play to load vars for managed-node1 41684 1727204450.17859: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41684 1727204450.18102: done with 
get_vars() 41684 1727204450.18110: variable 'ansible_search_path' from source: unknown 41684 1727204450.18111: variable 'ansible_search_path' from source: unknown 41684 1727204450.18152: we have included files to process 41684 1727204450.18153: generating all_blocks data 41684 1727204450.18155: done generating all_blocks data 41684 1727204450.18161: processing included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 41684 1727204450.18162: loading included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 41684 1727204450.18168: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 41684 1727204450.18570: in VariableManager get_vars() 41684 1727204450.18596: done with get_vars() 41684 1727204450.18794: done sending task result for task 0affcd87-79f5-3839-086d-00000000016c 41684 1727204450.18799: WORKER PROCESS EXITING 41684 1727204450.19112: done processing included file 41684 1727204450.19113: iterating over new_blocks loaded from include file 41684 1727204450.19115: in VariableManager get_vars() 41684 1727204450.19132: done with get_vars() 41684 1727204450.19169: filtering new block on tags 41684 1727204450.19186: done filtering new block on tags 41684 1727204450.19188: done iterating over new_blocks loaded from include file included: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml for managed-node1 41684 1727204450.19192: extending task lists for all hosts with included blocks 41684 1727204450.19618: done extending task lists 41684 1727204450.19620: done processing included files 41684 1727204450.19621: results queue empty 41684 1727204450.19621: checking for any_errors_fatal 41684 1727204450.19625: done checking for any_errors_fatal 41684 1727204450.19626: checking for 
max_fail_percentage 41684 1727204450.19627: done checking for max_fail_percentage 41684 1727204450.19628: checking to see if all hosts have failed and the running result is not ok 41684 1727204450.19628: done checking to see if all hosts have failed 41684 1727204450.19629: getting the remaining hosts for this loop 41684 1727204450.19631: done getting the remaining hosts for this loop 41684 1727204450.19633: getting the next task for host managed-node1 41684 1727204450.19637: done getting next task for host managed-node1 41684 1727204450.19641: ^ task is: TASK: Include the task 'get_current_interfaces.yml' 41684 1727204450.19644: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 41684 1727204450.19646: getting variables 41684 1727204450.19647: in VariableManager get_vars() 41684 1727204450.19661: Calling all_inventory to load vars for managed-node1 41684 1727204450.19663: Calling groups_inventory to load vars for managed-node1 41684 1727204450.19668: Calling all_plugins_inventory to load vars for managed-node1 41684 1727204450.19679: Calling all_plugins_play to load vars for managed-node1 41684 1727204450.19682: Calling groups_plugins_inventory to load vars for managed-node1 41684 1727204450.19688: Calling groups_plugins_play to load vars for managed-node1 41684 1727204450.19833: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41684 1727204450.20017: done with get_vars() 41684 1727204450.20026: done getting variables TASK [Include the task 'get_current_interfaces.yml'] *************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml:3 Tuesday 24 September 2024 15:00:50 -0400 (0:00:00.055) 0:00:06.602 ***** 41684 1727204450.20102: entering _queue_task() for managed-node1/include_tasks 41684 1727204450.20416: worker is 1 (out of 1 available) 41684 1727204450.20427: exiting _queue_task() for managed-node1/include_tasks 41684 1727204450.20444: done queuing things up, now waiting for results queue to drain 41684 1727204450.20445: waiting for pending results... 
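`show_interfaces.yml:3` includes `get_current_interfaces.yml`, and the next task the strategy selects is `Gather current interface info`, which feeds the `Set current_interfaces` fact seen earlier in this log. A hedged reconstruction of `get_current_interfaces.yml`, assuming the interface list comes from `/sys/class/net` (consistent with entries like `bonding_masters` appearing in the fact); the actual command, register name, and expression are assumptions:

```yaml
# Hypothetical get_current_interfaces.yml -- the task names match this log;
# the command, register name, and set_fact expression are assumptions.
- name: Gather current interface info
  command: ls /sys/class/net
  register: _current_interfaces
  changed_when: false

- name: Set current_interfaces
  set_fact:
    current_interfaces: "{{ _current_interfaces.stdout_lines }}"
```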
41684 1727204450.21006: running TaskExecutor() for managed-node1/TASK: Include the task 'get_current_interfaces.yml' 41684 1727204450.21190: in run() - task 0affcd87-79f5-3839-086d-00000000019d 41684 1727204450.21244: variable 'ansible_search_path' from source: unknown 41684 1727204450.21252: variable 'ansible_search_path' from source: unknown 41684 1727204450.21294: calling self._execute() 41684 1727204450.21385: variable 'ansible_host' from source: host vars for 'managed-node1' 41684 1727204450.21397: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41684 1727204450.21412: variable 'omit' from source: magic vars 41684 1727204450.21826: variable 'ansible_distribution_major_version' from source: facts 41684 1727204450.21846: Evaluated conditional (ansible_distribution_major_version != '6'): True 41684 1727204450.21861: _execute() done 41684 1727204450.21871: dumping result to json 41684 1727204450.21879: done dumping result, returning 41684 1727204450.21889: done running TaskExecutor() for managed-node1/TASK: Include the task 'get_current_interfaces.yml' [0affcd87-79f5-3839-086d-00000000019d] 41684 1727204450.21912: sending task result for task 0affcd87-79f5-3839-086d-00000000019d 41684 1727204450.22049: no more pending results, returning what we have 41684 1727204450.22055: in VariableManager get_vars() 41684 1727204450.22105: Calling all_inventory to load vars for managed-node1 41684 1727204450.22109: Calling groups_inventory to load vars for managed-node1 41684 1727204450.22111: Calling all_plugins_inventory to load vars for managed-node1 41684 1727204450.22127: Calling all_plugins_play to load vars for managed-node1 41684 1727204450.22131: Calling groups_plugins_inventory to load vars for managed-node1 41684 1727204450.22134: Calling groups_plugins_play to load vars for managed-node1 41684 1727204450.22341: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41684 
1727204450.22603: done with get_vars() 41684 1727204450.22612: variable 'ansible_search_path' from source: unknown 41684 1727204450.22613: variable 'ansible_search_path' from source: unknown 41684 1727204450.22675: done sending task result for task 0affcd87-79f5-3839-086d-00000000019d 41684 1727204450.22683: WORKER PROCESS EXITING 41684 1727204450.22722: we have included files to process 41684 1727204450.22724: generating all_blocks data 41684 1727204450.22726: done generating all_blocks data 41684 1727204450.22727: processing included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 41684 1727204450.22728: loading included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 41684 1727204450.22730: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 41684 1727204450.23490: done processing included file 41684 1727204450.23492: iterating over new_blocks loaded from include file 41684 1727204450.23494: in VariableManager get_vars() 41684 1727204450.23515: done with get_vars() 41684 1727204450.23516: filtering new block on tags 41684 1727204450.23535: done filtering new block on tags 41684 1727204450.23537: done iterating over new_blocks loaded from include file included: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml for managed-node1 41684 1727204450.23542: extending task lists for all hosts with included blocks 41684 1727204450.23712: done extending task lists 41684 1727204450.23714: done processing included files 41684 1727204450.23715: results queue empty 41684 1727204450.23716: checking for any_errors_fatal 41684 1727204450.23719: done checking for any_errors_fatal 41684 1727204450.23720: checking for max_fail_percentage 41684 
1727204450.23721: done checking for max_fail_percentage 41684 1727204450.23722: checking to see if all hosts have failed and the running result is not ok 41684 1727204450.23722: done checking to see if all hosts have failed 41684 1727204450.23723: getting the remaining hosts for this loop 41684 1727204450.23725: done getting the remaining hosts for this loop 41684 1727204450.23727: getting the next task for host managed-node1 41684 1727204450.23732: done getting next task for host managed-node1 41684 1727204450.23735: ^ task is: TASK: Gather current interface info 41684 1727204450.23780: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 41684 1727204450.23783: getting variables 41684 1727204450.23785: in VariableManager get_vars() 41684 1727204450.23893: Calling all_inventory to load vars for managed-node1 41684 1727204450.23895: Calling groups_inventory to load vars for managed-node1 41684 1727204450.23897: Calling all_plugins_inventory to load vars for managed-node1 41684 1727204450.23906: Calling all_plugins_play to load vars for managed-node1 41684 1727204450.23909: Calling groups_plugins_inventory to load vars for managed-node1 41684 1727204450.23912: Calling groups_plugins_play to load vars for managed-node1 41684 1727204450.24622: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41684 1727204450.25048: done with get_vars() 41684 1727204450.25057: done getting variables 41684 1727204450.25097: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Gather current interface info] ******************************************* task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:3 Tuesday 24 September 2024 15:00:50 -0400 (0:00:00.050) 0:00:06.652 ***** 41684 1727204450.25127: entering _queue_task() for managed-node1/command 41684 1727204450.25525: worker is 1 (out of 1 available) 41684 1727204450.25539: exiting _queue_task() for managed-node1/command 41684 1727204450.25552: done queuing things up, now waiting for results queue to drain 41684 1727204450.25554: waiting for pending results... 
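The command task queued here can be read back from the `module_args` the worker logs once the module returns (`chdir: /sys/class/net`, `_raw_params: ls -1`). A minimal sketch of the task at `get_current_interfaces.yml:3` under those assumptions; the `register` variable name is a guess, not taken from the log:

```yaml
# Hypothetical reconstruction from the logged module_args; only cmd and
# chdir are confirmed by the log, the register name is an assumption.
- name: Gather current interface info
  command:
    cmd: ls -1
    chdir: /sys/class/net
  register: _current_interfaces
```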
41684 1727204450.26944: running TaskExecutor() for managed-node1/TASK: Gather current interface info 41684 1727204450.27041: in run() - task 0affcd87-79f5-3839-086d-0000000001d4 41684 1727204450.27058: variable 'ansible_search_path' from source: unknown 41684 1727204450.27061: variable 'ansible_search_path' from source: unknown 41684 1727204450.27112: calling self._execute() 41684 1727204450.27190: variable 'ansible_host' from source: host vars for 'managed-node1' 41684 1727204450.27779: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41684 1727204450.27797: variable 'omit' from source: magic vars 41684 1727204450.28233: variable 'ansible_distribution_major_version' from source: facts 41684 1727204450.29566: Evaluated conditional (ansible_distribution_major_version != '6'): True 41684 1727204450.29580: variable 'omit' from source: magic vars 41684 1727204450.29914: variable 'omit' from source: magic vars 41684 1727204450.29967: variable 'omit' from source: magic vars 41684 1727204450.30024: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 41684 1727204450.30130: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 41684 1727204450.30273: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 41684 1727204450.30300: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41684 1727204450.30316: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41684 1727204450.30350: variable 'inventory_hostname' from source: host vars for 'managed-node1' 41684 1727204450.30359: variable 'ansible_host' from source: host vars for 'managed-node1' 41684 1727204450.30678: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41684 
1727204450.30789: Set connection var ansible_connection to ssh 41684 1727204450.30997: Set connection var ansible_pipelining to False 41684 1727204450.31008: Set connection var ansible_module_compression to ZIP_DEFLATED 41684 1727204450.31017: Set connection var ansible_timeout to 10 41684 1727204450.31029: Set connection var ansible_shell_executable to /bin/sh 41684 1727204450.31036: Set connection var ansible_shell_type to sh 41684 1727204450.31070: variable 'ansible_shell_executable' from source: unknown 41684 1727204450.31078: variable 'ansible_connection' from source: unknown 41684 1727204450.31086: variable 'ansible_module_compression' from source: unknown 41684 1727204450.31092: variable 'ansible_shell_type' from source: unknown 41684 1727204450.31099: variable 'ansible_shell_executable' from source: unknown 41684 1727204450.31105: variable 'ansible_host' from source: host vars for 'managed-node1' 41684 1727204450.31112: variable 'ansible_pipelining' from source: unknown 41684 1727204450.31119: variable 'ansible_timeout' from source: unknown 41684 1727204450.31125: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41684 1727204450.31266: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 41684 1727204450.31298: variable 'omit' from source: magic vars 41684 1727204450.31307: starting attempt loop 41684 1727204450.31313: running the handler 41684 1727204450.31382: _low_level_execute_command(): starting 41684 1727204450.31395: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 41684 1727204450.33274: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 41684 1727204450.33315: stderr chunk (state=3): >>>debug1: Reading configuration data 
/root/.ssh/config <<< 41684 1727204450.33338: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41684 1727204450.33372: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41684 1727204450.33418: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 41684 1727204450.33434: stderr chunk (state=3): >>>debug2: match not found <<< 41684 1727204450.33454: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204450.33480: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 41684 1727204450.33494: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 41684 1727204450.33505: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 41684 1727204450.33517: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41684 1727204450.33531: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41684 1727204450.33551: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41684 1727204450.33574: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 41684 1727204450.33589: stderr chunk (state=3): >>>debug2: match found <<< 41684 1727204450.33604: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204450.33693: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 41684 1727204450.33711: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41684 1727204450.33726: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41684 1727204450.33819: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session 
id: 2 <<< 41684 1727204450.35476: stdout chunk (state=3): >>>/root <<< 41684 1727204450.35696: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41684 1727204450.35700: stdout chunk (state=3): >>><<< 41684 1727204450.35702: stderr chunk (state=3): >>><<< 41684 1727204450.35828: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41684 1727204450.35833: _low_level_execute_command(): starting 41684 1727204450.35836: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204450.3572595-42460-51579596003926 `" && echo ansible-tmp-1727204450.3572595-42460-51579596003926="` echo /root/.ansible/tmp/ansible-tmp-1727204450.3572595-42460-51579596003926 `" ) && sleep 0' 41684 1727204450.36744: stderr chunk (state=2): 
>>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 41684 1727204450.36747: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41684 1727204450.36787: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204450.36791: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41684 1727204450.36793: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204450.36859: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 41684 1727204450.36870: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41684 1727204450.36873: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41684 1727204450.36939: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41684 1727204450.38790: stdout chunk (state=3): >>>ansible-tmp-1727204450.3572595-42460-51579596003926=/root/.ansible/tmp/ansible-tmp-1727204450.3572595-42460-51579596003926 <<< 41684 1727204450.38986: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41684 1727204450.39017: stderr chunk (state=3): >>><<< 41684 1727204450.39020: stdout chunk (state=3): >>><<< 41684 1727204450.39333: _low_level_execute_command() done: rc=0, 
stdout=ansible-tmp-1727204450.3572595-42460-51579596003926=/root/.ansible/tmp/ansible-tmp-1727204450.3572595-42460-51579596003926 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41684 1727204450.39337: variable 'ansible_module_compression' from source: unknown 41684 1727204450.39339: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-41684fyviudxd/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 41684 1727204450.39341: variable 'ansible_facts' from source: unknown 41684 1727204450.39344: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204450.3572595-42460-51579596003926/AnsiballZ_command.py 41684 1727204450.39407: Sending initial data 41684 1727204450.39410: Sent initial data (155 bytes) 41684 1727204450.40427: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 41684 1727204450.40442: stderr chunk (state=3): >>>debug1: Reading 
configuration data /root/.ssh/config <<< 41684 1727204450.40457: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41684 1727204450.40483: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41684 1727204450.40525: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 41684 1727204450.40540: stderr chunk (state=3): >>>debug2: match not found <<< 41684 1727204450.40555: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204450.40577: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 41684 1727204450.40590: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 41684 1727204450.40601: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 41684 1727204450.40614: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41684 1727204450.40628: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41684 1727204450.40645: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41684 1727204450.40660: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 41684 1727204450.40674: stderr chunk (state=3): >>>debug2: match found <<< 41684 1727204450.40687: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204450.40768: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 41684 1727204450.40801: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41684 1727204450.40819: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41684 1727204450.40899: stderr chunk (state=3): >>>debug1: 
mux_client_request_session: master session id: 2 <<< 41684 1727204450.42608: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 41684 1727204450.42655: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 41684 1727204450.42708: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-41684fyviudxd/tmp08rkstow /root/.ansible/tmp/ansible-tmp-1727204450.3572595-42460-51579596003926/AnsiballZ_command.py <<< 41684 1727204450.42757: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 41684 1727204450.43970: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41684 1727204450.44178: stderr chunk (state=3): >>><<< 41684 1727204450.44182: stdout chunk (state=3): >>><<< 41684 1727204450.44185: done transferring module to remote 41684 1727204450.44187: _low_level_execute_command(): starting 41684 1727204450.44189: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204450.3572595-42460-51579596003926/ /root/.ansible/tmp/ansible-tmp-1727204450.3572595-42460-51579596003926/AnsiballZ_command.py && sleep 0' 41684 1727204450.44829: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 41684 1727204450.44850: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41684 
1727204450.44871: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41684 1727204450.44896: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41684 1727204450.44937: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 41684 1727204450.44956: stderr chunk (state=3): >>>debug2: match not found <<< 41684 1727204450.44975: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204450.44993: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 41684 1727204450.45005: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 41684 1727204450.45017: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 41684 1727204450.45029: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41684 1727204450.45043: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41684 1727204450.45069: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41684 1727204450.45082: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 41684 1727204450.45093: stderr chunk (state=3): >>>debug2: match found <<< 41684 1727204450.45105: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204450.45188: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 41684 1727204450.45211: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41684 1727204450.45227: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41684 1727204450.45314: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41684 
1727204450.47098: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41684 1727204450.47102: stdout chunk (state=3): >>><<< 41684 1727204450.47104: stderr chunk (state=3): >>><<< 41684 1727204450.47201: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41684 1727204450.47205: _low_level_execute_command(): starting 41684 1727204450.47208: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204450.3572595-42460-51579596003926/AnsiballZ_command.py && sleep 0' 41684 1727204450.47825: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 41684 1727204450.47840: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41684 1727204450.47856: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41684 
1727204450.47885: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41684 1727204450.47927: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 41684 1727204450.47940: stderr chunk (state=3): >>>debug2: match not found <<< 41684 1727204450.47955: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204450.47982: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 41684 1727204450.47998: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 41684 1727204450.48010: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 41684 1727204450.48023: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41684 1727204450.48038: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41684 1727204450.48055: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41684 1727204450.48074: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 41684 1727204450.48093: stderr chunk (state=3): >>>debug2: match found <<< 41684 1727204450.48169: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204450.48878: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 41684 1727204450.48927: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41684 1727204450.48984: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41684 1727204450.49082: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41684 1727204450.62358: stdout chunk (state=3): >>> {"changed": true, "stdout": "bonding_masters\neth0\nlo\nrpltstbr", "stderr": 
"", "rc": 0, "cmd": ["ls", "-1"], "start": "2024-09-24 15:00:50.619911", "end": "2024-09-24 15:00:50.622754", "delta": "0:00:00.002843", "msg": "", "invocation": {"module_args": {"chdir": "/sys/class/net", "_raw_params": "ls -1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 41684 1727204450.63600: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.148 closed. <<< 41684 1727204450.63683: stderr chunk (state=3): >>><<< 41684 1727204450.63687: stdout chunk (state=3): >>><<< 41684 1727204450.63825: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "bonding_masters\neth0\nlo\nrpltstbr", "stderr": "", "rc": 0, "cmd": ["ls", "-1"], "start": "2024-09-24 15:00:50.619911", "end": "2024-09-24 15:00:50.622754", "delta": "0:00:00.002843", "msg": "", "invocation": {"module_args": {"chdir": "/sys/class/net", "_raw_params": "ls -1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking 
match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.148 closed. 41684 1727204450.63835: done with _execute_module (ansible.legacy.command, {'chdir': '/sys/class/net', '_raw_params': 'ls -1', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204450.3572595-42460-51579596003926/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 41684 1727204450.63838: _low_level_execute_command(): starting 41684 1727204450.63841: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204450.3572595-42460-51579596003926/ > /dev/null 2>&1 && sleep 0' 41684 1727204450.64476: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 41684 1727204450.64492: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41684 1727204450.64513: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41684 1727204450.64533: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41684 1727204450.64579: stderr chunk (state=3): >>>debug2: checking 
match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 41684 1727204450.64592: stderr chunk (state=3): >>>debug2: match not found <<< 41684 1727204450.64607: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204450.64632: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 41684 1727204450.64645: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 41684 1727204450.64657: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 41684 1727204450.64674: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41684 1727204450.64691: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41684 1727204450.64708: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41684 1727204450.64721: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 41684 1727204450.64739: stderr chunk (state=3): >>>debug2: match found <<< 41684 1727204450.64753: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204450.64834: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 41684 1727204450.64867: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41684 1727204450.64886: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41684 1727204450.64979: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41684 1727204450.66714: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41684 1727204450.66813: stderr chunk (state=3): >>><<< 41684 1727204450.66824: stdout chunk (state=3): >>><<< 41684 1727204450.66869: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 
4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41684 1727204450.66872: handler run complete 41684 1727204450.67176: Evaluated conditional (False): False 41684 1727204450.67179: attempt loop complete, returning result 41684 1727204450.67182: _execute() done 41684 1727204450.67184: dumping result to json 41684 1727204450.67186: done dumping result, returning 41684 1727204450.67188: done running TaskExecutor() for managed-node1/TASK: Gather current interface info [0affcd87-79f5-3839-086d-0000000001d4] 41684 1727204450.67190: sending task result for task 0affcd87-79f5-3839-086d-0000000001d4 41684 1727204450.67324: done sending task result for task 0affcd87-79f5-3839-086d-0000000001d4 41684 1727204450.67328: WORKER PROCESS EXITING ok: [managed-node1] => { "changed": false, "cmd": [ "ls", "-1" ], "delta": "0:00:00.002843", "end": "2024-09-24 15:00:50.622754", "rc": 0, "start": "2024-09-24 15:00:50.619911" } STDOUT: bonding_masters eth0 
lo rpltstbr 41684 1727204450.67728: no more pending results, returning what we have 41684 1727204450.67731: results queue empty 41684 1727204450.67732: checking for any_errors_fatal 41684 1727204450.67734: done checking for any_errors_fatal 41684 1727204450.67735: checking for max_fail_percentage 41684 1727204450.67736: done checking for max_fail_percentage 41684 1727204450.67737: checking to see if all hosts have failed and the running result is not ok 41684 1727204450.67738: done checking to see if all hosts have failed 41684 1727204450.67739: getting the remaining hosts for this loop 41684 1727204450.67740: done getting the remaining hosts for this loop 41684 1727204450.67745: getting the next task for host managed-node1 41684 1727204450.67751: done getting next task for host managed-node1 41684 1727204450.67754: ^ task is: TASK: Set current_interfaces 41684 1727204450.67759: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False 41684 1727204450.67767: getting variables 41684 1727204450.67769: in VariableManager get_vars() 41684 1727204450.67872: Calling all_inventory to load vars for managed-node1 41684 1727204450.67875: Calling groups_inventory to load vars for managed-node1 41684 1727204450.67878: Calling all_plugins_inventory to load vars for managed-node1 41684 1727204450.67889: Calling all_plugins_play to load vars for managed-node1 41684 1727204450.67892: Calling groups_plugins_inventory to load vars for managed-node1 41684 1727204450.67899: Calling groups_plugins_play to load vars for managed-node1 41684 1727204450.68309: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41684 1727204450.68547: done with get_vars() 41684 1727204450.68559: done getting variables 41684 1727204450.68626: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set current_interfaces] ************************************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:9 Tuesday 24 September 2024 15:00:50 -0400 (0:00:00.435) 0:00:07.088 ***** 41684 1727204450.68659: entering _queue_task() for managed-node1/set_fact 41684 1727204450.68970: worker is 1 (out of 1 available) 41684 1727204450.68984: exiting _queue_task() for managed-node1/set_fact 41684 1727204450.68998: done queuing things up, now waiting for results queue to drain 41684 1727204450.68999: waiting for pending results... 
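The `ok:` result above (rc=0, cmd `["ls", "-1"]`, listing `bonding_masters eth0 lo rpltstbr`) comes from the "Gather current interface info" task in the test collection. A hypothetical reconstruction of that task follows — only the command and the registered variable name are taken from the log; the `chdir` value is an assumption inferred from the output, which matches the typical contents of `/sys/class/net`:

```yaml
# Hypothetical sketch of tasks/get_current_interfaces.yml; not the actual file.
# Registered results surface as "from source: set_fact" in -vvvv logs, which is
# consistent with the later "variable '_current_interfaces'" lines.
- name: Gather current interface info
  command:
    cmd: ls -1
    chdir: /sys/class/net   # assumed: the output matches this directory's entries
  register: _current_interfaces
```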
41684 1727204450.69297: running TaskExecutor() for managed-node1/TASK: Set current_interfaces 41684 1727204450.69430: in run() - task 0affcd87-79f5-3839-086d-0000000001d5 41684 1727204450.69455: variable 'ansible_search_path' from source: unknown 41684 1727204450.69468: variable 'ansible_search_path' from source: unknown 41684 1727204450.69513: calling self._execute() 41684 1727204450.69603: variable 'ansible_host' from source: host vars for 'managed-node1' 41684 1727204450.69617: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41684 1727204450.69631: variable 'omit' from source: magic vars 41684 1727204450.70200: variable 'ansible_distribution_major_version' from source: facts 41684 1727204450.70222: Evaluated conditional (ansible_distribution_major_version != '6'): True 41684 1727204450.70237: variable 'omit' from source: magic vars 41684 1727204450.70302: variable 'omit' from source: magic vars 41684 1727204450.70421: variable '_current_interfaces' from source: set_fact 41684 1727204450.70502: variable 'omit' from source: magic vars 41684 1727204450.70545: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 41684 1727204450.70593: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 41684 1727204450.70622: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 41684 1727204450.70645: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41684 1727204450.70671: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41684 1727204450.70706: variable 'inventory_hostname' from source: host vars for 'managed-node1' 41684 1727204450.70718: variable 'ansible_host' from source: host vars for 'managed-node1' 41684 1727204450.70726: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41684 1727204450.70845: Set connection var ansible_connection to ssh 41684 1727204450.70857: Set connection var ansible_pipelining to False 41684 1727204450.70873: Set connection var ansible_module_compression to ZIP_DEFLATED 41684 1727204450.70888: Set connection var ansible_timeout to 10 41684 1727204450.70901: Set connection var ansible_shell_executable to /bin/sh 41684 1727204450.70908: Set connection var ansible_shell_type to sh 41684 1727204450.70943: variable 'ansible_shell_executable' from source: unknown 41684 1727204450.70951: variable 'ansible_connection' from source: unknown 41684 1727204450.70958: variable 'ansible_module_compression' from source: unknown 41684 1727204450.70970: variable 'ansible_shell_type' from source: unknown 41684 1727204450.70978: variable 'ansible_shell_executable' from source: unknown 41684 1727204450.70989: variable 'ansible_host' from source: host vars for 'managed-node1' 41684 1727204450.70998: variable 'ansible_pipelining' from source: unknown 41684 1727204450.71005: variable 'ansible_timeout' from source: unknown 41684 1727204450.71012: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41684 1727204450.71174: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 41684 1727204450.71191: variable 'omit' from source: magic vars 41684 1727204450.71206: starting attempt loop 41684 1727204450.71214: running the handler 41684 1727204450.71230: handler run complete 41684 1727204450.71246: attempt loop complete, returning result 41684 1727204450.71258: _execute() done 41684 1727204450.71272: dumping result to json 41684 1727204450.71280: done dumping result, returning 41684 
1727204450.71292: done running TaskExecutor() for managed-node1/TASK: Set current_interfaces [0affcd87-79f5-3839-086d-0000000001d5] 41684 1727204450.71303: sending task result for task 0affcd87-79f5-3839-086d-0000000001d5 41684 1727204450.71414: done sending task result for task 0affcd87-79f5-3839-086d-0000000001d5 ok: [managed-node1] => { "ansible_facts": { "current_interfaces": [ "bonding_masters", "eth0", "lo", "rpltstbr" ] }, "changed": false } 41684 1727204450.71486: no more pending results, returning what we have 41684 1727204450.71490: results queue empty 41684 1727204450.71491: checking for any_errors_fatal 41684 1727204450.71498: done checking for any_errors_fatal 41684 1727204450.71499: checking for max_fail_percentage 41684 1727204450.71501: done checking for max_fail_percentage 41684 1727204450.71502: checking to see if all hosts have failed and the running result is not ok 41684 1727204450.71503: done checking to see if all hosts have failed 41684 1727204450.71503: getting the remaining hosts for this loop 41684 1727204450.71505: done getting the remaining hosts for this loop 41684 1727204450.71509: getting the next task for host managed-node1 41684 1727204450.71521: done getting next task for host managed-node1 41684 1727204450.71524: ^ task is: TASK: Show current_interfaces 41684 1727204450.71528: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 41684 1727204450.71531: getting variables 41684 1727204450.71533: in VariableManager get_vars() 41684 1727204450.71583: Calling all_inventory to load vars for managed-node1 41684 1727204450.71586: Calling groups_inventory to load vars for managed-node1 41684 1727204450.71589: Calling all_plugins_inventory to load vars for managed-node1 41684 1727204450.71601: Calling all_plugins_play to load vars for managed-node1 41684 1727204450.71604: Calling groups_plugins_inventory to load vars for managed-node1 41684 1727204450.71607: Calling groups_plugins_play to load vars for managed-node1 41684 1727204450.71851: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41684 1727204450.73190: done with get_vars() 41684 1727204450.73204: done getting variables 41684 1727204450.73269: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Show current_interfaces] ************************************************* task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml:5 Tuesday 24 September 2024 15:00:50 -0400 (0:00:00.046) 0:00:07.134 ***** 41684 1727204450.73309: entering _queue_task() for managed-node1/debug 41684 1727204450.74182: WORKER PROCESS EXITING 41684 1727204450.74716: worker is 1 (out of 1 available) 41684 1727204450.74730: exiting _queue_task() for managed-node1/debug 41684 1727204450.74742: done queuing things up, now waiting for results queue to drain 41684 1727204450.74743: waiting for 
pending results... 41684 1727204450.75830: running TaskExecutor() for managed-node1/TASK: Show current_interfaces 41684 1727204450.75921: in run() - task 0affcd87-79f5-3839-086d-00000000019e 41684 1727204450.75934: variable 'ansible_search_path' from source: unknown 41684 1727204450.75938: variable 'ansible_search_path' from source: unknown 41684 1727204450.75980: calling self._execute() 41684 1727204450.76136: variable 'ansible_host' from source: host vars for 'managed-node1' 41684 1727204450.76140: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41684 1727204450.76149: variable 'omit' from source: magic vars 41684 1727204450.77205: variable 'ansible_distribution_major_version' from source: facts 41684 1727204450.77218: Evaluated conditional (ansible_distribution_major_version != '6'): True 41684 1727204450.77225: variable 'omit' from source: magic vars 41684 1727204450.77287: variable 'omit' from source: magic vars 41684 1727204450.77408: variable 'current_interfaces' from source: set_fact 41684 1727204450.77445: variable 'omit' from source: magic vars 41684 1727204450.77493: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 41684 1727204450.78208: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 41684 1727204450.78237: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 41684 1727204450.78260: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41684 1727204450.78279: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41684 1727204450.78312: variable 'inventory_hostname' from source: host vars for 'managed-node1' 41684 1727204450.78322: variable 'ansible_host' from source: host vars for 'managed-node1' 41684 1727204450.78328: 
variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41684 1727204450.78417: Set connection var ansible_connection to ssh 41684 1727204450.78428: Set connection var ansible_pipelining to False 41684 1727204450.78436: Set connection var ansible_module_compression to ZIP_DEFLATED 41684 1727204450.78444: Set connection var ansible_timeout to 10 41684 1727204450.78454: Set connection var ansible_shell_executable to /bin/sh 41684 1727204450.78459: Set connection var ansible_shell_type to sh 41684 1727204450.78493: variable 'ansible_shell_executable' from source: unknown 41684 1727204450.78504: variable 'ansible_connection' from source: unknown 41684 1727204450.78510: variable 'ansible_module_compression' from source: unknown 41684 1727204450.78514: variable 'ansible_shell_type' from source: unknown 41684 1727204450.78520: variable 'ansible_shell_executable' from source: unknown 41684 1727204450.78524: variable 'ansible_host' from source: host vars for 'managed-node1' 41684 1727204450.78530: variable 'ansible_pipelining' from source: unknown 41684 1727204450.78535: variable 'ansible_timeout' from source: unknown 41684 1727204450.78541: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41684 1727204450.78688: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 41684 1727204450.78706: variable 'omit' from source: magic vars 41684 1727204450.78716: starting attempt loop 41684 1727204450.78722: running the handler 41684 1727204450.78775: handler run complete 41684 1727204450.78795: attempt loop complete, returning result 41684 1727204450.78802: _execute() done 41684 1727204450.78810: dumping result to json 41684 1727204450.78816: done dumping result, returning 41684 
1727204450.78828: done running TaskExecutor() for managed-node1/TASK: Show current_interfaces [0affcd87-79f5-3839-086d-00000000019e] 41684 1727204450.78838: sending task result for task 0affcd87-79f5-3839-086d-00000000019e 41684 1727204450.78948: done sending task result for task 0affcd87-79f5-3839-086d-00000000019e 41684 1727204450.78955: WORKER PROCESS EXITING ok: [managed-node1] => {} MSG: current_interfaces: ['bonding_masters', 'eth0', 'lo', 'rpltstbr'] 41684 1727204450.79075: no more pending results, returning what we have 41684 1727204450.79079: results queue empty 41684 1727204450.79080: checking for any_errors_fatal 41684 1727204450.79083: done checking for any_errors_fatal 41684 1727204450.79084: checking for max_fail_percentage 41684 1727204450.79086: done checking for max_fail_percentage 41684 1727204450.79086: checking to see if all hosts have failed and the running result is not ok 41684 1727204450.79087: done checking to see if all hosts have failed 41684 1727204450.79087: getting the remaining hosts for this loop 41684 1727204450.79089: done getting the remaining hosts for this loop 41684 1727204450.79093: getting the next task for host managed-node1 41684 1727204450.79100: done getting next task for host managed-node1 41684 1727204450.79103: ^ task is: TASK: Install iproute 41684 1727204450.79105: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 41684 1727204450.79108: getting variables 41684 1727204450.79110: in VariableManager get_vars() 41684 1727204450.79145: Calling all_inventory to load vars for managed-node1 41684 1727204450.79147: Calling groups_inventory to load vars for managed-node1 41684 1727204450.79150: Calling all_plugins_inventory to load vars for managed-node1 41684 1727204450.79160: Calling all_plugins_play to load vars for managed-node1 41684 1727204450.79166: Calling groups_plugins_inventory to load vars for managed-node1 41684 1727204450.79170: Calling groups_plugins_play to load vars for managed-node1 41684 1727204450.79343: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41684 1727204450.79580: done with get_vars() 41684 1727204450.79669: done getting variables 41684 1727204450.79734: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Install iproute] ********************************************************* task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:16 Tuesday 24 September 2024 15:00:50 -0400 (0:00:00.064) 0:00:07.199 ***** 41684 1727204450.79768: entering _queue_task() for managed-node1/package 41684 1727204450.80533: worker is 1 (out of 1 available) 41684 1727204450.80547: exiting _queue_task() for managed-node1/package 41684 1727204450.80559: done queuing things up, now waiting for results queue to drain 41684 1727204450.80561: waiting for pending results... 
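The "Set current_interfaces" and "Show current_interfaces" results above (task paths `get_current_interfaces.yml:9` and `show_interfaces.yml:5`) set a fact from the earlier registered listing and then print it. A hedged sketch of what those two tasks could look like — the task names and variable names come from the log, but the exact expressions are assumptions:

```yaml
# Illustrative only; the real tasks live in the fedora.linux_system_roles tests.
- name: Set current_interfaces
  set_fact:
    current_interfaces: "{{ _current_interfaces.stdout_lines }}"

- name: Show current_interfaces
  debug:
    msg: "current_interfaces: {{ current_interfaces }}"
```

This wiring reproduces the logged debug output, `current_interfaces: ['bonding_masters', 'eth0', 'lo', 'rpltstbr']`.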
41684 1727204450.81736: running TaskExecutor() for managed-node1/TASK: Install iproute 41684 1727204450.81840: in run() - task 0affcd87-79f5-3839-086d-00000000016d 41684 1727204450.81863: variable 'ansible_search_path' from source: unknown 41684 1727204450.81875: variable 'ansible_search_path' from source: unknown 41684 1727204450.81919: calling self._execute() 41684 1727204450.82008: variable 'ansible_host' from source: host vars for 'managed-node1' 41684 1727204450.82020: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41684 1727204450.82036: variable 'omit' from source: magic vars 41684 1727204450.82408: variable 'ansible_distribution_major_version' from source: facts 41684 1727204450.82887: Evaluated conditional (ansible_distribution_major_version != '6'): True 41684 1727204450.82897: variable 'omit' from source: magic vars 41684 1727204450.82938: variable 'omit' from source: magic vars 41684 1727204450.83133: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 41684 1727204450.88137: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 41684 1727204450.88746: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 41684 1727204450.88793: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 41684 1727204450.88832: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 41684 1727204450.88868: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 41684 1727204450.88968: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 41684 1727204450.89195: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 41684 1727204450.89228: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 41684 1727204450.89280: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 41684 1727204450.89787: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 41684 1727204450.89903: variable '__network_is_ostree' from source: set_fact 41684 1727204450.89936: variable 'omit' from source: magic vars 41684 1727204450.89975: variable 'omit' from source: magic vars 41684 1727204450.90009: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 41684 1727204450.90039: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 41684 1727204450.90063: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 41684 1727204450.90088: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41684 1727204450.90101: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41684 1727204450.90134: variable 'inventory_hostname' from source: host vars for 'managed-node1' 41684 1727204450.90141: variable 'ansible_host' from source: host vars for 'managed-node1' 41684 1727204450.90148: variable 'ansible_ssh_extra_args' from 
source: host vars for 'managed-node1' 41684 1727204450.90250: Set connection var ansible_connection to ssh 41684 1727204450.90260: Set connection var ansible_pipelining to False 41684 1727204450.90272: Set connection var ansible_module_compression to ZIP_DEFLATED 41684 1727204450.90282: Set connection var ansible_timeout to 10 41684 1727204450.90292: Set connection var ansible_shell_executable to /bin/sh 41684 1727204450.90298: Set connection var ansible_shell_type to sh 41684 1727204450.90444: variable 'ansible_shell_executable' from source: unknown 41684 1727204450.90453: variable 'ansible_connection' from source: unknown 41684 1727204450.90460: variable 'ansible_module_compression' from source: unknown 41684 1727204450.90468: variable 'ansible_shell_type' from source: unknown 41684 1727204450.90475: variable 'ansible_shell_executable' from source: unknown 41684 1727204450.90481: variable 'ansible_host' from source: host vars for 'managed-node1' 41684 1727204450.90488: variable 'ansible_pipelining' from source: unknown 41684 1727204450.90495: variable 'ansible_timeout' from source: unknown 41684 1727204450.90502: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41684 1727204450.90614: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 41684 1727204450.90773: variable 'omit' from source: magic vars 41684 1727204450.90783: starting attempt loop 41684 1727204450.90833: running the handler 41684 1727204450.90845: variable 'ansible_facts' from source: unknown 41684 1727204450.90852: variable 'ansible_facts' from source: unknown 41684 1727204450.90897: _low_level_execute_command(): starting 41684 1727204450.90908: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 41684 
1727204450.92688: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 41684 1727204450.92893: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 41684 1727204450.92897: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41684 1727204450.92936: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found <<< 41684 1727204450.92940: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 41684 1727204450.92943: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found <<< 41684 1727204450.92947: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204450.93002: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 41684 1727204450.93087: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41684 1727204450.93090: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41684 1727204450.93299: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41684 1727204450.94830: stdout chunk (state=3): >>>/root <<< 41684 1727204450.94985: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41684 1727204450.95049: stderr chunk (state=3): >>><<< 41684 1727204450.95052: stdout chunk (state=3): >>><<< 41684 1727204450.95176: 
_low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41684 1727204450.95179: _low_level_execute_command(): starting 41684 1727204450.95182: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204450.950793-42500-5804330574150 `" && echo ansible-tmp-1727204450.950793-42500-5804330574150="` echo /root/.ansible/tmp/ansible-tmp-1727204450.950793-42500-5804330574150 `" ) && sleep 0' 41684 1727204450.95981: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 41684 1727204450.95998: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41684 1727204450.96007: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41684 1727204450.96068: stderr chunk (state=3): >>>debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41684 1727204450.96111: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 41684 1727204450.96124: stderr chunk (state=3): >>>debug2: match not found <<< 41684 1727204450.96139: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204450.96157: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 41684 1727204450.96178: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 41684 1727204450.96190: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 41684 1727204450.96203: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41684 1727204450.96216: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41684 1727204450.96233: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41684 1727204450.96245: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 41684 1727204450.96256: stderr chunk (state=3): >>>debug2: match found <<< 41684 1727204450.96275: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204450.96355: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 41684 1727204450.96384: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41684 1727204450.96406: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41684 1727204450.96496: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41684 1727204450.98355: stdout chunk (state=3): >>>ansible-tmp-1727204450.950793-42500-5804330574150=/root/.ansible/tmp/ansible-tmp-1727204450.950793-42500-5804330574150 <<< 41684 
1727204450.98570: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41684 1727204450.98575: stdout chunk (state=3): >>><<< 41684 1727204450.98578: stderr chunk (state=3): >>><<< 41684 1727204450.98672: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204450.950793-42500-5804330574150=/root/.ansible/tmp/ansible-tmp-1727204450.950793-42500-5804330574150 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41684 1727204450.98677: variable 'ansible_module_compression' from source: unknown 41684 1727204450.98794: ANSIBALLZ: Using generic lock for ansible.legacy.dnf 41684 1727204450.98798: ANSIBALLZ: Acquiring lock 41684 1727204450.98801: ANSIBALLZ: Lock acquired: 139842516808240 41684 1727204450.98803: ANSIBALLZ: Creating module 41684 1727204451.30124: ANSIBALLZ: Writing module into payload 41684 1727204451.30798: ANSIBALLZ: Writing module 41684 1727204451.30975: 
ANSIBALLZ: Renaming module 41684 1727204451.31226: ANSIBALLZ: Done creating module 41684 1727204451.31251: variable 'ansible_facts' from source: unknown 41684 1727204451.31348: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204450.950793-42500-5804330574150/AnsiballZ_dnf.py 41684 1727204451.32100: Sending initial data 41684 1727204451.32109: Sent initial data (149 bytes) 41684 1727204451.34659: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 41684 1727204451.34681: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41684 1727204451.34695: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41684 1727204451.34714: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41684 1727204451.34811: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 41684 1727204451.34824: stderr chunk (state=3): >>>debug2: match not found <<< 41684 1727204451.34842: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204451.34861: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 41684 1727204451.34878: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 41684 1727204451.34892: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 41684 1727204451.34904: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41684 1727204451.34918: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41684 1727204451.34983: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41684 1727204451.35001: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 41684 1727204451.35017: stderr 
chunk (state=3): >>>debug2: match found <<< 41684 1727204451.35031: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204451.35221: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 41684 1727204451.35239: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41684 1727204451.35282: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41684 1727204451.35437: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41684 1727204451.37129: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 41684 1727204451.37187: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 41684 1727204451.37234: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-41684fyviudxd/tmpzdo9p0gh /root/.ansible/tmp/ansible-tmp-1727204450.950793-42500-5804330574150/AnsiballZ_dnf.py <<< 41684 1727204451.37290: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 41684 1727204451.38987: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41684 1727204451.39173: stderr chunk (state=3): >>><<< 41684 1727204451.39176: stdout chunk (state=3): >>><<< 41684 1727204451.39179: done transferring module to remote 41684 
1727204451.39182: _low_level_execute_command(): starting 41684 1727204451.39255: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204450.950793-42500-5804330574150/ /root/.ansible/tmp/ansible-tmp-1727204450.950793-42500-5804330574150/AnsiballZ_dnf.py && sleep 0' 41684 1727204451.40925: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 41684 1727204451.40936: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41684 1727204451.40952: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41684 1727204451.40971: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41684 1727204451.41045: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 41684 1727204451.41054: stderr chunk (state=3): >>>debug2: match not found <<< 41684 1727204451.41073: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204451.41130: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 41684 1727204451.41136: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 41684 1727204451.41143: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 41684 1727204451.41151: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41684 1727204451.41160: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41684 1727204451.41178: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41684 1727204451.41185: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 41684 1727204451.41192: stderr chunk (state=3): >>>debug2: match found <<< 41684 1727204451.41201: stderr chunk 
(state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204451.41334: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 41684 1727204451.41354: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41684 1727204451.41404: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41684 1727204451.41487: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41684 1727204451.43290: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41684 1727204451.43295: stdout chunk (state=3): >>><<< 41684 1727204451.43302: stderr chunk (state=3): >>><<< 41684 1727204451.43327: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41684 1727204451.43331: _low_level_execute_command(): starting 41684 
1727204451.43335: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204450.950793-42500-5804330574150/AnsiballZ_dnf.py && sleep 0' 41684 1727204451.46160: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 41684 1727204451.46240: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41684 1727204451.46258: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41684 1727204451.46283: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41684 1727204451.46330: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 41684 1727204451.46388: stderr chunk (state=3): >>>debug2: match not found <<< 41684 1727204451.46404: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204451.46422: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 41684 1727204451.46434: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 41684 1727204451.46449: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 41684 1727204451.46462: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41684 1727204451.46482: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41684 1727204451.46501: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41684 1727204451.46572: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 41684 1727204451.46586: stderr chunk (state=3): >>>debug2: match found <<< 41684 1727204451.46600: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204451.46840: 
stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 41684 1727204451.46866: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41684 1727204451.46882: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41684 1727204451.47069: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41684 1727204452.38874: stdout chunk (state=3): >>> {"msg": "Nothing to do", "changed": false, "results": [], "rc": 0, "invocation": {"module_args": {"name": ["iproute"], "state": "present", "allow_downgrade": false, "allowerasing": false, "autoremove": false, "bugfix": false, "cacheonly": false, "disable_gpg_check": false, "disable_plugin": [], "disablerepo": [], "download_only": false, "enable_plugin": [], "enablerepo": [], "exclude": [], "installroot": "/", "install_repoquery": true, "install_weak_deps": true, "security": false, "skip_broken": false, "update_cache": false, "update_only": false, "validate_certs": true, "sslverify": true, "lock_timeout": 30, "use_backend": "auto", "best": null, "conf_file": null, "disable_excludes": null, "download_dir": null, "list": null, "nobest": null, "releasever": null}}} <<< 41684 1727204452.58071: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.148 closed. 
<<< 41684 1727204452.58077: stdout chunk (state=3): >>><<< 41684 1727204452.58084: stderr chunk (state=3): >>><<< 41684 1727204452.58103: _low_level_execute_command() done: rc=0, stdout= {"msg": "Nothing to do", "changed": false, "results": [], "rc": 0, "invocation": {"module_args": {"name": ["iproute"], "state": "present", "allow_downgrade": false, "allowerasing": false, "autoremove": false, "bugfix": false, "cacheonly": false, "disable_gpg_check": false, "disable_plugin": [], "disablerepo": [], "download_only": false, "enable_plugin": [], "enablerepo": [], "exclude": [], "installroot": "/", "install_repoquery": true, "install_weak_deps": true, "security": false, "skip_broken": false, "update_cache": false, "update_only": false, "validate_certs": true, "sslverify": true, "lock_timeout": 30, "use_backend": "auto", "best": null, "conf_file": null, "disable_excludes": null, "download_dir": null, "list": null, "nobest": null, "releasever": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 
debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.148 closed. 41684 1727204452.58151: done with _execute_module (ansible.legacy.dnf, {'name': 'iproute', 'state': 'present', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.dnf', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204450.950793-42500-5804330574150/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 41684 1727204452.58157: _low_level_execute_command(): starting 41684 1727204452.58166: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204450.950793-42500-5804330574150/ > /dev/null 2>&1 && sleep 0' 41684 1727204452.58819: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 41684 1727204452.58823: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41684 1727204452.58826: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41684 1727204452.58844: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41684 1727204452.58888: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 41684 1727204452.58892: stderr chunk (state=3): >>>debug2: match not found <<< 41684 1727204452.58903: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204452.58921: stderr 
chunk (state=3): >>>debug1: configuration requests final Match pass <<< 41684 1727204452.58924: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 41684 1727204452.58927: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 41684 1727204452.58935: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41684 1727204452.58945: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41684 1727204452.58955: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41684 1727204452.58967: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 41684 1727204452.58970: stderr chunk (state=3): >>>debug2: match found <<< 41684 1727204452.58982: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204452.59057: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 41684 1727204452.59071: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41684 1727204452.59091: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41684 1727204452.59172: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41684 1727204452.61015: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41684 1727204452.61092: stderr chunk (state=3): >>><<< 41684 1727204452.61110: stdout chunk (state=3): >>><<< 41684 1727204452.61375: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41684 1727204452.61379: handler run complete 41684 1727204452.61386: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 41684 1727204452.61535: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 41684 1727204452.61582: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 41684 1727204452.61625: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 41684 1727204452.61659: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 41684 1727204452.61748: variable '__install_status' from source: unknown 41684 1727204452.61780: Evaluated conditional (__install_status is success): True 41684 1727204452.61803: attempt loop complete, returning result 41684 1727204452.61818: _execute() done 41684 1727204452.61826: dumping result to json 41684 1727204452.61836: done dumping result, returning 41684 1727204452.61850: done running TaskExecutor() for managed-node1/TASK: Install iproute [0affcd87-79f5-3839-086d-00000000016d] 41684 1727204452.61859: sending task 
result for task 0affcd87-79f5-3839-086d-00000000016d ok: [managed-node1] => { "attempts": 1, "changed": false, "rc": 0, "results": [] } MSG: Nothing to do 41684 1727204452.62077: no more pending results, returning what we have 41684 1727204452.62081: results queue empty 41684 1727204452.62082: checking for any_errors_fatal 41684 1727204452.62090: done checking for any_errors_fatal 41684 1727204452.62091: checking for max_fail_percentage 41684 1727204452.62092: done checking for max_fail_percentage 41684 1727204452.62093: checking to see if all hosts have failed and the running result is not ok 41684 1727204452.62094: done checking to see if all hosts have failed 41684 1727204452.62095: getting the remaining hosts for this loop 41684 1727204452.62096: done getting the remaining hosts for this loop 41684 1727204452.62100: getting the next task for host managed-node1 41684 1727204452.62107: done getting next task for host managed-node1 41684 1727204452.62110: ^ task is: TASK: Create veth interface {{ interface }} 41684 1727204452.62113: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 41684 1727204452.62117: getting variables 41684 1727204452.62119: in VariableManager get_vars() 41684 1727204452.62168: Calling all_inventory to load vars for managed-node1 41684 1727204452.62171: Calling groups_inventory to load vars for managed-node1 41684 1727204452.62174: Calling all_plugins_inventory to load vars for managed-node1 41684 1727204452.62194: Calling all_plugins_play to load vars for managed-node1 41684 1727204452.62198: Calling groups_plugins_inventory to load vars for managed-node1 41684 1727204452.62204: Calling groups_plugins_play to load vars for managed-node1 41684 1727204452.62605: done sending task result for task 0affcd87-79f5-3839-086d-00000000016d 41684 1727204452.62608: WORKER PROCESS EXITING 41684 1727204452.62627: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41684 1727204452.62753: done with get_vars() 41684 1727204452.62760: done getting variables 41684 1727204452.62809: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 41684 1727204452.62899: variable 'interface' from source: set_fact TASK [Create veth interface ethtest0] ****************************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:27 Tuesday 24 September 2024 15:00:52 -0400 (0:00:01.831) 0:00:09.030 ***** 41684 1727204452.62931: entering _queue_task() for managed-node1/command 41684 1727204452.63139: worker is 1 (out of 1 available) 41684 1727204452.63151: exiting _queue_task() for managed-node1/command 41684 1727204452.63169: done queuing things up, now waiting for results queue to drain 41684 1727204452.63170: waiting for pending results... 
41684 1727204452.63325: running TaskExecutor() for managed-node1/TASK: Create veth interface ethtest0 41684 1727204452.63403: in run() - task 0affcd87-79f5-3839-086d-00000000016e 41684 1727204452.63414: variable 'ansible_search_path' from source: unknown 41684 1727204452.63417: variable 'ansible_search_path' from source: unknown 41684 1727204452.63635: variable 'interface' from source: set_fact 41684 1727204452.63700: variable 'interface' from source: set_fact 41684 1727204452.63751: variable 'interface' from source: set_fact 41684 1727204452.63860: Loaded config def from plugin (lookup/items) 41684 1727204452.63869: Loading LookupModule 'items' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/items.py 41684 1727204452.63884: variable 'omit' from source: magic vars 41684 1727204452.63970: variable 'ansible_host' from source: host vars for 'managed-node1' 41684 1727204452.63978: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41684 1727204452.63987: variable 'omit' from source: magic vars 41684 1727204452.64156: variable 'ansible_distribution_major_version' from source: facts 41684 1727204452.64167: Evaluated conditional (ansible_distribution_major_version != '6'): True 41684 1727204452.64348: variable 'type' from source: set_fact 41684 1727204452.64358: variable 'state' from source: include params 41684 1727204452.64369: variable 'interface' from source: set_fact 41684 1727204452.64377: variable 'current_interfaces' from source: set_fact 41684 1727204452.64388: Evaluated conditional (type == 'veth' and state == 'present' and interface not in current_interfaces): True 41684 1727204452.64397: variable 'omit' from source: magic vars 41684 1727204452.64434: variable 'omit' from source: magic vars 41684 1727204452.64484: variable 'item' from source: unknown 41684 1727204452.64559: variable 'item' from source: unknown 41684 1727204452.64582: variable 'omit' from source: magic vars 41684 1727204452.64614: trying 
/usr/local/lib/python3.12/site-packages/ansible/plugins/connection 41684 1727204452.64645: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 41684 1727204452.64671: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 41684 1727204452.64695: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41684 1727204452.64711: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41684 1727204452.64745: variable 'inventory_hostname' from source: host vars for 'managed-node1' 41684 1727204452.64753: variable 'ansible_host' from source: host vars for 'managed-node1' 41684 1727204452.64761: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41684 1727204452.64862: Set connection var ansible_connection to ssh 41684 1727204452.64877: Set connection var ansible_pipelining to False 41684 1727204452.64887: Set connection var ansible_module_compression to ZIP_DEFLATED 41684 1727204452.64898: Set connection var ansible_timeout to 10 41684 1727204452.64910: Set connection var ansible_shell_executable to /bin/sh 41684 1727204452.64918: Set connection var ansible_shell_type to sh 41684 1727204452.64943: variable 'ansible_shell_executable' from source: unknown 41684 1727204452.64951: variable 'ansible_connection' from source: unknown 41684 1727204452.64960: variable 'ansible_module_compression' from source: unknown 41684 1727204452.64970: variable 'ansible_shell_type' from source: unknown 41684 1727204452.64977: variable 'ansible_shell_executable' from source: unknown 41684 1727204452.64984: variable 'ansible_host' from source: host vars for 'managed-node1' 41684 1727204452.64992: variable 'ansible_pipelining' from source: unknown 41684 1727204452.64998: variable 'ansible_timeout' from 
source: unknown 41684 1727204452.65005: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41684 1727204452.65141: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 41684 1727204452.65158: variable 'omit' from source: magic vars 41684 1727204452.65169: starting attempt loop 41684 1727204452.65175: running the handler 41684 1727204452.65195: _low_level_execute_command(): starting 41684 1727204452.65205: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 41684 1727204452.65961: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 41684 1727204452.65981: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41684 1727204452.65999: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41684 1727204452.66017: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41684 1727204452.66060: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 41684 1727204452.66078: stderr chunk (state=3): >>>debug2: match not found <<< 41684 1727204452.66093: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204452.66113: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 41684 1727204452.66126: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 41684 1727204452.66138: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 41684 1727204452.66149: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41684 1727204452.66165: stderr 
chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<<
41684 1727204452.66185: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<<
41684 1727204452.66197: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<<
41684 1727204452.66209: stderr chunk (state=3): >>>debug2: match found <<<
41684 1727204452.66223: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
41684 1727204452.66301: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<<
41684 1727204452.66322: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<<
41684 1727204452.66337: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
41684 1727204452.66426: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
41684 1727204452.67963: stdout chunk (state=3): >>>/root <<<
41684 1727204452.68071: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
41684 1727204452.68127: stderr chunk (state=3): >>><<<
41684 1727204452.68131: stdout chunk (state=3): >>><<<
41684 1727204452.68153: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0
41684 1727204452.68173: _low_level_execute_command(): starting
41684 1727204452.68218: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204452.6815226-42905-39652478989429 `" && echo ansible-tmp-1727204452.6815226-42905-39652478989429="` echo /root/.ansible/tmp/ansible-tmp-1727204452.6815226-42905-39652478989429 `" ) && sleep 0'
41684 1727204452.68955: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<<
41684 1727204452.68961: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
41684 1727204452.68972: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<<
41684 1727204452.69090: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<<
41684 1727204452.69132: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
41684 1727204452.71140: stdout chunk (state=3): >>>ansible-tmp-1727204452.6815226-42905-39652478989429=/root/.ansible/tmp/ansible-tmp-1727204452.6815226-42905-39652478989429 <<<
41684 1727204452.71282: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
41684 1727204452.71344: stderr chunk (state=3): >>><<<
41684 1727204452.71348: stdout chunk (state=3): >>><<<
41684 1727204452.71573: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204452.6815226-42905-39652478989429=/root/.ansible/tmp/ansible-tmp-1727204452.6815226-42905-39652478989429 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0
41684 1727204452.71576: variable 'ansible_module_compression' from source: unknown
41684 1727204452.71579: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-41684fyviudxd/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED
41684 1727204452.71581: variable 'ansible_facts' from source: unknown
41684 1727204452.71589: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204452.6815226-42905-39652478989429/AnsiballZ_command.py
41684 1727204452.71997: Sending initial data
41684 1727204452.72000: Sent initial data (155 bytes)
41684 1727204452.73021: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<<
41684 1727204452.73025: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<<
41684 1727204452.73059: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found <<<
41684 1727204452.73063: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<<
41684 1727204452.73067: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found <<<
41684 1727204452.73069: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
41684 1727204452.73141: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<<
41684 1727204452.73144: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<<
41684 1727204452.73146: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
41684 1727204452.73213: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
41684 1727204452.74930: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<<
41684 1727204452.74989: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<<
41684 1727204452.75043: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-41684fyviudxd/tmpr7pi46ud /root/.ansible/tmp/ansible-tmp-1727204452.6815226-42905-39652478989429/AnsiballZ_command.py <<<
41684 1727204452.75101: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<<
41684 1727204452.76344: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
41684 1727204452.76558: stderr chunk (state=3): >>><<<
41684 1727204452.76562: stdout chunk (state=3): >>><<<
41684 1727204452.76572: done transferring module to remote
41684 1727204452.76574: _low_level_execute_command(): starting
41684 1727204452.76577: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204452.6815226-42905-39652478989429/ /root/.ansible/tmp/ansible-tmp-1727204452.6815226-42905-39652478989429/AnsiballZ_command.py && sleep 0'
41684 1727204452.77254: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<<
41684 1727204452.77274: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<<
41684 1727204452.77295: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<<
41684 1727204452.77320: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<<
41684 1727204452.77378: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<<
41684 1727204452.77395: stderr chunk (state=3): >>>debug2: match not found <<<
41684 1727204452.77415: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
41684 1727204452.77446: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<<
41684 1727204452.77459: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<<
41684 1727204452.77477: stderr chunk (state=3): >>>debug1: re-parsing configuration <<<
41684 1727204452.77491: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<<
41684 1727204452.77509: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<<
41684 1727204452.77526: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<<
41684 1727204452.77549: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<<
41684 1727204452.77569: stderr chunk (state=3): >>>debug2: match found <<<
41684 1727204452.77584: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
41684 1727204452.77668: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<<
41684 1727204452.77699: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<<
41684 1727204452.77719: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
41684 1727204452.77877: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
41684 1727204452.79522: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
41684 1727204452.79645: stderr chunk (state=3): >>><<<
41684 1727204452.79689: stdout chunk (state=3): >>><<<
41684 1727204452.79769: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0
41684 1727204452.79772: _low_level_execute_command(): starting
41684 1727204452.79842: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204452.6815226-42905-39652478989429/AnsiballZ_command.py && sleep 0'
41684 1727204452.81502: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<<
41684 1727204452.81540: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<<
41684 1727204452.81556: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<<
41684 1727204452.81579: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<<
41684 1727204452.81628: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<<
41684 1727204452.81642: stderr chunk (state=3): >>>debug2: match not found <<<
41684 1727204452.81656: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
41684 1727204452.81679: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<<
41684 1727204452.81691: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<<
41684 1727204452.81701: stderr chunk (state=3): >>>debug1: re-parsing configuration <<<
41684 1727204452.81714: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<<
41684 1727204452.81809: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<<
41684 1727204452.81826: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<<
41684 1727204452.81839: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<<
41684 1727204452.81850: stderr chunk (state=3): >>>debug2: match found <<<
41684 1727204452.81869: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
41684 1727204452.81945: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<<
41684 1727204452.81973: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<<
41684 1727204452.81991: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
41684 1727204452.82088: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
41684 1727204452.96084: stdout chunk (state=3): >>> <<<
41684 1727204452.96102: stdout chunk (state=3): >>>{"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "add", "ethtest0", "type", "veth", "peer", "name", "peerethtest0"], "start": "2024-09-24 15:00:52.949600", "end": "2024-09-24 15:00:52.959562", "delta": "0:00:00.009962", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link add ethtest0 type veth peer name peerethtest0", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<<
41684 1727204452.98792: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.148 closed. <<<
41684 1727204452.98852: stderr chunk (state=3): >>><<<
41684 1727204452.98866: stdout chunk (state=3): >>><<<
41684 1727204452.98966: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "add", "ethtest0", "type", "veth", "peer", "name", "peerethtest0"], "start": "2024-09-24 15:00:52.949600", "end": "2024-09-24 15:00:52.959562", "delta": "0:00:00.009962", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link add ethtest0 type veth peer name peerethtest0", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.148 closed.
41684 1727204452.98975: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip link add ethtest0 type veth peer name peerethtest0', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204452.6815226-42905-39652478989429/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None})
41684 1727204452.98978: _low_level_execute_command(): starting
41684 1727204452.98980: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204452.6815226-42905-39652478989429/ > /dev/null 2>&1 && sleep 0'
41684 1727204453.00004: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<<
41684 1727204453.00023: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<<
41684 1727204453.00144: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<<
41684 1727204453.00223: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<<
41684 1727204453.00322: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<<
41684 1727204453.00399: stderr chunk (state=3): >>>debug2: match not found <<<
41684 1727204453.00414: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
41684 1727204453.00460: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<<
41684 1727204453.00513: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<<
41684 1727204453.00524: stderr chunk (state=3): >>>debug1: re-parsing configuration <<<
41684 1727204453.00561: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<<
41684 1727204453.00600: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<<
41684 1727204453.00633: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<<
41684 1727204453.00646: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<<
41684 1727204453.00658: stderr chunk (state=3): >>>debug2: match found <<<
41684 1727204453.00679: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
41684 1727204453.00769: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<<
41684 1727204453.00796: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<<
41684 1727204453.00853: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
41684 1727204453.00951: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
41684 1727204453.02860: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
41684 1727204453.03011: stderr chunk (state=3): >>><<<
41684 1727204453.03022: stdout chunk (state=3): >>><<<
41684 1727204453.03152: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0
41684 1727204453.03156: handler run complete
41684 1727204453.03158: Evaluated conditional (False): False
41684 1727204453.03160: attempt loop complete, returning result
41684 1727204453.03166: variable 'item' from source: unknown
41684 1727204453.03260: variable 'item' from source: unknown
ok: [managed-node1] => (item=ip link add ethtest0 type veth peer name peerethtest0) => {
    "ansible_loop_var": "item",
    "changed": false,
    "cmd": [
        "ip",
        "link",
        "add",
        "ethtest0",
        "type",
        "veth",
        "peer",
        "name",
        "peerethtest0"
    ],
    "delta": "0:00:00.009962",
    "end": "2024-09-24 15:00:52.959562",
    "item": "ip link add ethtest0 type veth peer name peerethtest0",
    "rc": 0,
    "start": "2024-09-24 15:00:52.949600"
}
41684 1727204453.03523: variable 'ansible_host' from source: host vars for 'managed-node1'
41684 1727204453.03527: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1'
41684 1727204453.03645: variable 'omit' from source: magic vars
41684 1727204453.03739: variable 'ansible_distribution_major_version' from source: facts
41684 1727204453.03756: Evaluated conditional (ansible_distribution_major_version != '6'): True
41684 1727204453.03943: variable 'type' from source: set_fact
41684 1727204453.03954: variable 'state' from source: include params
41684 1727204453.03968: variable 'interface' from source: set_fact
41684 1727204453.03984: variable 'current_interfaces' from source: set_fact
41684 1727204453.03995: Evaluated conditional (type == 'veth' and state == 'present' and interface not in current_interfaces): True
41684 1727204453.04003: variable 'omit' from source: magic vars
41684 1727204453.04023: variable 'omit' from source: magic vars
41684 1727204453.04100: variable 'item' from source: unknown
41684 1727204453.04330: variable 'item' from source: unknown
41684 1727204453.04351: variable 'omit' from source: magic vars
41684 1727204453.04389: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False)
41684 1727204453.04403: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
41684 1727204453.04413: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
41684 1727204453.04443: variable 'inventory_hostname' from source: host vars for 'managed-node1'
41684 1727204453.04452: variable 'ansible_host' from source: host vars for 'managed-node1'
41684 1727204453.04459: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1'
41684 1727204453.04567: Set connection var ansible_connection to ssh
41684 1727204453.04581: Set connection var ansible_pipelining to False
41684 1727204453.04591: Set connection var ansible_module_compression to ZIP_DEFLATED
41684 1727204453.04601: Set connection var ansible_timeout to 10
41684 1727204453.04612: Set connection var ansible_shell_executable to /bin/sh
41684 1727204453.04618: Set connection var ansible_shell_type to sh
41684 1727204453.04651: variable 'ansible_shell_executable' from source: unknown
41684 1727204453.04666: variable 'ansible_connection' from source: unknown
41684 1727204453.04674: variable 'ansible_module_compression' from source: unknown
41684 1727204453.04689: variable 'ansible_shell_type' from source: unknown
41684 1727204453.04697: variable 'ansible_shell_executable' from source: unknown
41684 1727204453.04704: variable 'ansible_host' from source: host vars for 'managed-node1'
41684 1727204453.04712: variable 'ansible_pipelining' from source: unknown
41684 1727204453.04800: variable 'ansible_timeout' from source: unknown
41684 1727204453.04810: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1'
41684 1727204453.04927: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False)
41684 1727204453.04942: variable 'omit' from source: magic vars
41684 1727204453.04951: starting attempt loop
41684 1727204453.04958: running the handler
41684 1727204453.04975: _low_level_execute_command(): starting
41684 1727204453.04986: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0'
41684 1727204453.07050: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<<
41684 1727204453.07070: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<<
41684 1727204453.07088: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<<
41684 1727204453.07104: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<<
41684 1727204453.07157: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<<
41684 1727204453.07174: stderr chunk (state=3): >>>debug2: match not found <<<
41684 1727204453.07237: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
41684 1727204453.07261: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<<
41684 1727204453.07278: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<<
41684 1727204453.07289: stderr chunk (state=3): >>>debug1: re-parsing configuration <<<
41684 1727204453.07329: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<<
41684 1727204453.08437: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<<
41684 1727204453.08542: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<<
41684 1727204453.08565: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<<
41684 1727204453.08580: stderr chunk (state=3): >>>debug2: match found <<<
41684 1727204453.08595: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
41684 1727204453.08698: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<<
41684 1727204453.08766: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<<
41684 1727204453.08786: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
41684 1727204453.08882: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
41684 1727204453.10409: stdout chunk (state=3): >>>/root <<<
41684 1727204453.10615: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
41684 1727204453.10619: stdout chunk (state=3): >>><<<
41684 1727204453.10621: stderr chunk (state=3): >>><<<
41684 1727204453.10732: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0
41684 1727204453.10735: _low_level_execute_command(): starting
41684 1727204453.10738: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204453.1063826-42905-155577681013608 `" && echo ansible-tmp-1727204453.1063826-42905-155577681013608="` echo /root/.ansible/tmp/ansible-tmp-1727204453.1063826-42905-155577681013608 `" ) && sleep 0'
41684 1727204453.11381: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<<
41684 1727204453.11402: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<<
41684 1727204453.11416: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<<
41684 1727204453.11432: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<<
41684 1727204453.11481: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<<
41684 1727204453.11494: stderr chunk (state=3): >>>debug2: match not found <<<
41684 1727204453.11511: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
41684 1727204453.11527: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<<
41684 1727204453.11538: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<<
41684 1727204453.11547: stderr chunk (state=3): >>>debug1: re-parsing configuration <<<
41684 1727204453.11558: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<<
41684 1727204453.11576: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<<
41684 1727204453.11591: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<<
41684 1727204453.11602: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<<
41684 1727204453.11618: stderr chunk (state=3): >>>debug2: match found <<<
41684 1727204453.11630: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
41684 1727204453.13550: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<<
41684 1727204453.13588: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<<
41684 1727204453.13608: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
41684 1727204453.13709: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
41684 1727204453.15578: stdout chunk (state=3): >>>ansible-tmp-1727204453.1063826-42905-155577681013608=/root/.ansible/tmp/ansible-tmp-1727204453.1063826-42905-155577681013608 <<<
41684 1727204453.15799: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
41684 1727204453.15804: stdout chunk (state=3): >>><<<
41684 1727204453.15806: stderr chunk (state=3): >>><<<
41684 1727204453.16073: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204453.1063826-42905-155577681013608=/root/.ansible/tmp/ansible-tmp-1727204453.1063826-42905-155577681013608 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0
41684 1727204453.16083: variable 'ansible_module_compression' from source: unknown
41684 1727204453.16086: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-41684fyviudxd/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED
41684 1727204453.16088: variable 'ansible_facts' from source: unknown
41684 1727204453.16090: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204453.1063826-42905-155577681013608/AnsiballZ_command.py
41684 1727204453.16816: Sending initial data
41684 1727204453.16820: Sent initial data (156 bytes)
41684 1727204453.18872: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<<
41684 1727204453.18895: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<<
41684 1727204453.18912: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<<
41684 1727204453.18940: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<<
41684 1727204453.18989: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<<
41684 1727204453.19001: stderr chunk (state=3): >>>debug2: match not found <<<
41684 1727204453.19018: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
41684 1727204453.19167: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<<
41684 1727204453.19184: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<<
41684 1727204453.19197: stderr chunk (state=3): >>>debug1: re-parsing configuration <<<
41684 1727204453.19209: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<<
41684 1727204453.19222: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<<
41684 1727204453.19237: stderr chunk
(state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41684 1727204453.19250: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 41684 1727204453.19271: stderr chunk (state=3): >>>debug2: match found <<< 41684 1727204453.19285: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204453.19365: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 41684 1727204453.19394: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41684 1727204453.19410: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41684 1727204453.19500: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41684 1727204453.21223: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 41684 1727204453.21276: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 41684 1727204453.21326: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-41684fyviudxd/tmp992ij972 /root/.ansible/tmp/ansible-tmp-1727204453.1063826-42905-155577681013608/AnsiballZ_command.py <<< 41684 1727204453.21379: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 41684 1727204453.22991: 
stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41684 1727204453.23118: stderr chunk (state=3): >>><<< 41684 1727204453.23122: stdout chunk (state=3): >>><<< 41684 1727204453.23124: done transferring module to remote 41684 1727204453.23126: _low_level_execute_command(): starting 41684 1727204453.23129: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204453.1063826-42905-155577681013608/ /root/.ansible/tmp/ansible-tmp-1727204453.1063826-42905-155577681013608/AnsiballZ_command.py && sleep 0' 41684 1727204453.24187: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 41684 1727204453.24611: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41684 1727204453.24648: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41684 1727204453.24672: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41684 1727204453.24726: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 41684 1727204453.24739: stderr chunk (state=3): >>>debug2: match not found <<< 41684 1727204453.24752: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204453.24775: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 41684 1727204453.24788: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 41684 1727204453.24799: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 41684 1727204453.24812: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41684 1727204453.24826: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41684 1727204453.24844: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf <<< 41684 1727204453.24856: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 41684 1727204453.24870: stderr chunk (state=3): >>>debug2: match found <<< 41684 1727204453.24883: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204453.24952: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 41684 1727204453.24980: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41684 1727204453.24999: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41684 1727204453.25091: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41684 1727204453.27451: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41684 1727204453.28347: stderr chunk (state=3): >>><<< 41684 1727204453.28394: stdout chunk (state=3): >>><<< 41684 1727204453.28515: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41684 1727204453.28523: _low_level_execute_command(): starting 41684 1727204453.28526: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204453.1063826-42905-155577681013608/AnsiballZ_command.py && sleep 0' 41684 1727204453.29333: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 41684 1727204453.29337: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41684 1727204453.29377: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204453.29380: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 41684 1727204453.29384: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204453.29448: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 41684 1727204453.30088: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41684 1727204453.30368: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session 
id: 2 <<< 41684 1727204453.43630: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "set", "peerethtest0", "up"], "start": "2024-09-24 15:00:53.432176", "end": "2024-09-24 15:00:53.435523", "delta": "0:00:00.003347", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link set peerethtest0 up", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 41684 1727204453.44801: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.148 closed. <<< 41684 1727204453.44840: stderr chunk (state=3): >>><<< 41684 1727204453.44844: stdout chunk (state=3): >>><<< 41684 1727204453.44847: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "set", "peerethtest0", "up"], "start": "2024-09-24 15:00:53.432176", "end": "2024-09-24 15:00:53.435523", "delta": "0:00:00.003347", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link set peerethtest0 up", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.148 closed. 41684 1727204453.44942: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip link set peerethtest0 up', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204453.1063826-42905-155577681013608/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 41684 1727204453.44946: _low_level_execute_command(): starting 41684 1727204453.44948: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204453.1063826-42905-155577681013608/ > /dev/null 2>&1 && sleep 0' 41684 1727204453.45867: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 41684 1727204453.45876: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41684 1727204453.45879: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41684 1727204453.45882: stderr 
chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41684 1727204453.45884: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 41684 1727204453.45886: stderr chunk (state=3): >>>debug2: match not found <<< 41684 1727204453.45888: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204453.45890: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 41684 1727204453.45892: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 41684 1727204453.45894: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 41684 1727204453.45896: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41684 1727204453.45898: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41684 1727204453.45900: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41684 1727204453.45902: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 41684 1727204453.45904: stderr chunk (state=3): >>>debug2: match found <<< 41684 1727204453.45906: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204453.45908: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 41684 1727204453.45910: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41684 1727204453.45912: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41684 1727204453.46014: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41684 1727204453.47744: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41684 1727204453.47825: stderr chunk (state=3): >>><<< 
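The round-trip traced above (mkdir tmpdir → sftp put AnsiballZ_command.py → chmod → execute with the remote interpreter → rm -rf tmpdir) ends with the module emitting a single JSON object on stdout, which the controller parses back into the task result. A minimal sketch of extracting such a payload — the helper name is hypothetical and the input is abridged from the trace; Ansible's real parsing lives in its action-plugin internals, not here:

```python
import json

# Abridged stdout captured for the "ip link set peerethtest0 up" item,
# as seen in the trace above (module_args elided for brevity).
raw_stdout = '''
{"changed": true, "stdout": "", "stderr": "", "rc": 0,
 "cmd": ["ip", "link", "set", "peerethtest0", "up"],
 "start": "2024-09-24 15:00:53.432176",
 "end": "2024-09-24 15:00:53.435523",
 "delta": "0:00:00.003347", "msg": ""}
'''

def parse_module_result(stdout: str) -> dict:
    """Hypothetical helper: pull the module's JSON result out of raw stdout.

    The payload is a single JSON object, so slicing from the first '{'
    to the last '}' is enough for this illustration.
    """
    start = stdout.index("{")
    end = stdout.rindex("}") + 1
    return json.loads(stdout[start:end])

result = parse_module_result(raw_stdout)
print(result["rc"], result["cmd"])  # 0 ['ip', 'link', 'set', 'peerethtest0', 'up']
```

Note that the module itself reported `"changed": true`, while the displayed task result further down shows `"changed": false` — the conditional evaluation (`Evaluated conditional (False): False`) recorded in the log is where that override happens on the controller side.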
41684 1727204453.47828: stdout chunk (state=3): >>><<< 41684 1727204453.47974: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41684 1727204453.47982: handler run complete 41684 1727204453.47984: Evaluated conditional (False): False 41684 1727204453.47986: attempt loop complete, returning result 41684 1727204453.47988: variable 'item' from source: unknown 41684 1727204453.48084: variable 'item' from source: unknown ok: [managed-node1] => (item=ip link set peerethtest0 up) => { "ansible_loop_var": "item", "changed": false, "cmd": [ "ip", "link", "set", "peerethtest0", "up" ], "delta": "0:00:00.003347", "end": "2024-09-24 15:00:53.435523", "item": "ip link set peerethtest0 up", "rc": 0, "start": "2024-09-24 15:00:53.432176" } 41684 1727204453.48308: variable 'ansible_host' from source: host vars for 'managed-node1' 41684 1727204453.48312: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41684 1727204453.48314: variable 'omit' from source: magic vars 41684 1727204453.48489: variable 'ansible_distribution_major_version' from source: facts 41684 1727204453.48508: Evaluated conditional (ansible_distribution_major_version != '6'): True 41684 1727204453.48763: variable 'type' from source: set_fact 41684 1727204453.48775: variable 'state' from source: include params 41684 1727204453.48783: variable 'interface' from source: set_fact 41684 1727204453.48791: variable 'current_interfaces' from source: set_fact 41684 1727204453.48808: Evaluated conditional (type == 'veth' and state == 'present' and interface not in current_interfaces): True 41684 1727204453.48824: variable 'omit' from source: magic vars 41684 1727204453.48851: variable 'omit' from source: magic vars 41684 1727204453.48897: variable 'item' from source: unknown 41684 1727204453.48981: variable 'item' from source: unknown 41684 1727204453.49001: variable 'omit' from source: magic vars 41684 1727204453.49027: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 41684 1727204453.49042: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41684 1727204453.49053: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41684 1727204453.49091: variable 'inventory_hostname' from source: host vars for 'managed-node1' 41684 1727204453.49099: variable 'ansible_host' from source: host vars for 'managed-node1' 41684 1727204453.49107: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41684 1727204453.49197: Set connection var ansible_connection to ssh 41684 1727204453.49208: Set connection var ansible_pipelining to False 41684 1727204453.49228: Set connection 
var ansible_module_compression to ZIP_DEFLATED 41684 1727204453.49238: Set connection var ansible_timeout to 10 41684 1727204453.49252: Set connection var ansible_shell_executable to /bin/sh 41684 1727204453.49259: Set connection var ansible_shell_type to sh 41684 1727204453.49294: variable 'ansible_shell_executable' from source: unknown 41684 1727204453.49306: variable 'ansible_connection' from source: unknown 41684 1727204453.49314: variable 'ansible_module_compression' from source: unknown 41684 1727204453.49324: variable 'ansible_shell_type' from source: unknown 41684 1727204453.49331: variable 'ansible_shell_executable' from source: unknown 41684 1727204453.49336: variable 'ansible_host' from source: host vars for 'managed-node1' 41684 1727204453.49342: variable 'ansible_pipelining' from source: unknown 41684 1727204453.49347: variable 'ansible_timeout' from source: unknown 41684 1727204453.49354: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41684 1727204453.49452: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 41684 1727204453.49466: variable 'omit' from source: magic vars 41684 1727204453.49475: starting attempt loop 41684 1727204453.49482: running the handler 41684 1727204453.49491: _low_level_execute_command(): starting 41684 1727204453.49499: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 41684 1727204453.51020: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 41684 1727204453.51048: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41684 1727204453.51069: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41684 1727204453.51097: stderr chunk (state=3): 
>>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41684 1727204453.51147: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 41684 1727204453.51198: stderr chunk (state=3): >>>debug2: match not found <<< 41684 1727204453.51212: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204453.51230: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 41684 1727204453.51326: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 41684 1727204453.51338: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 41684 1727204453.51351: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41684 1727204453.51368: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41684 1727204453.51385: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41684 1727204453.51398: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 41684 1727204453.51409: stderr chunk (state=3): >>>debug2: match found <<< 41684 1727204453.51429: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204453.51507: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 41684 1727204453.51532: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41684 1727204453.51554: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41684 1727204453.51639: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41684 1727204453.53192: stdout chunk (state=3): >>>/root <<< 41684 1727204453.53292: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41684 
1727204453.53397: stderr chunk (state=3): >>><<< 41684 1727204453.53415: stdout chunk (state=3): >>><<< 41684 1727204453.53544: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41684 1727204453.53553: _low_level_execute_command(): starting 41684 1727204453.53556: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204453.534466-42905-13458628899847 `" && echo ansible-tmp-1727204453.534466-42905-13458628899847="` echo /root/.ansible/tmp/ansible-tmp-1727204453.534466-42905-13458628899847 `" ) && sleep 0' 41684 1727204453.54212: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 41684 1727204453.54225: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41684 1727204453.54242: stderr chunk 
(state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41684 1727204453.54260: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41684 1727204453.54306: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 41684 1727204453.54321: stderr chunk (state=3): >>>debug2: match not found <<< 41684 1727204453.54334: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204453.54350: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 41684 1727204453.54362: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 41684 1727204453.54378: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 41684 1727204453.54394: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41684 1727204453.54413: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41684 1727204453.54434: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41684 1727204453.54452: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 41684 1727204453.54467: stderr chunk (state=3): >>>debug2: match found <<< 41684 1727204453.54485: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204453.54570: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 41684 1727204453.54597: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41684 1727204453.54619: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41684 1727204453.54709: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41684 1727204453.56537: stdout chunk (state=3): 
>>>ansible-tmp-1727204453.534466-42905-13458628899847=/root/.ansible/tmp/ansible-tmp-1727204453.534466-42905-13458628899847 <<< 41684 1727204453.56754: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41684 1727204453.56757: stdout chunk (state=3): >>><<< 41684 1727204453.56760: stderr chunk (state=3): >>><<< 41684 1727204453.57015: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204453.534466-42905-13458628899847=/root/.ansible/tmp/ansible-tmp-1727204453.534466-42905-13458628899847 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41684 1727204453.57018: variable 'ansible_module_compression' from source: unknown 41684 1727204453.57021: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-41684fyviudxd/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 41684 1727204453.57023: variable 'ansible_facts' from source: unknown 41684 
1727204453.57025: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204453.534466-42905-13458628899847/AnsiballZ_command.py 41684 1727204453.57090: Sending initial data 41684 1727204453.57093: Sent initial data (154 bytes) 41684 1727204453.58124: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 41684 1727204453.58144: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41684 1727204453.58161: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41684 1727204453.58184: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41684 1727204453.58229: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 41684 1727204453.58242: stderr chunk (state=3): >>>debug2: match not found <<< 41684 1727204453.58257: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204453.58277: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 41684 1727204453.58290: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 41684 1727204453.58302: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 41684 1727204453.58315: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41684 1727204453.58330: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41684 1727204453.58346: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41684 1727204453.58361: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 41684 1727204453.58376: stderr chunk (state=3): >>>debug2: match found <<< 41684 1727204453.58391: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204453.58471: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 41684 1727204453.58495: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41684 1727204453.58513: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41684 1727204453.58607: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41684 1727204453.60311: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 41684 1727204453.60361: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 41684 1727204453.60409: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-41684fyviudxd/tmpy3y2kbi0 /root/.ansible/tmp/ansible-tmp-1727204453.534466-42905-13458628899847/AnsiballZ_command.py <<< 41684 1727204453.60465: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 41684 1727204453.61772: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41684 1727204453.61914: stderr chunk (state=3): >>><<< 41684 1727204453.61918: stdout chunk (state=3): >>><<< 41684 1727204453.61920: done transferring module to remote 41684 1727204453.61923: _low_level_execute_command(): starting 41684 1727204453.61925: _low_level_execute_command(): executing: /bin/sh -c 
'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204453.534466-42905-13458628899847/ /root/.ansible/tmp/ansible-tmp-1727204453.534466-42905-13458628899847/AnsiballZ_command.py && sleep 0' 41684 1727204453.62556: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 41684 1727204453.62578: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41684 1727204453.62594: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41684 1727204453.62614: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41684 1727204453.62657: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 41684 1727204453.62672: stderr chunk (state=3): >>>debug2: match not found <<< 41684 1727204453.62691: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204453.62710: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 41684 1727204453.62722: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 41684 1727204453.62734: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 41684 1727204453.62746: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41684 1727204453.62759: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41684 1727204453.62779: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41684 1727204453.62795: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 41684 1727204453.62807: stderr chunk (state=3): >>>debug2: match found <<< 41684 1727204453.62822: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204453.62904: 
stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 41684 1727204453.62921: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41684 1727204453.62935: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41684 1727204453.63025: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41684 1727204453.64733: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41684 1727204453.64796: stderr chunk (state=3): >>><<< 41684 1727204453.64798: stdout chunk (state=3): >>><<< 41684 1727204453.64866: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41684 1727204453.64870: _low_level_execute_command(): starting 41684 1727204453.64873: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 
/root/.ansible/tmp/ansible-tmp-1727204453.534466-42905-13458628899847/AnsiballZ_command.py && sleep 0' 41684 1727204453.65450: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 41684 1727204453.65454: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41684 1727204453.65481: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 41684 1727204453.65490: stderr chunk (state=3): >>>debug2: match not found <<< 41684 1727204453.65499: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204453.65511: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 41684 1727204453.65518: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 41684 1727204453.65525: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 41684 1727204453.65537: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41684 1727204453.65555: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41684 1727204453.65578: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41684 1727204453.65588: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204453.65646: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 41684 1727204453.65649: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41684 1727204453.65654: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41684 1727204453.65712: stderr chunk 
(state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41684 1727204453.79246: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "set", "ethtest0", "up"], "start": "2024-09-24 15:00:53.785888", "end": "2024-09-24 15:00:53.791677", "delta": "0:00:00.005789", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link set ethtest0 up", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 41684 1727204453.80460: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.148 closed. <<< 41684 1727204453.80469: stdout chunk (state=3): >>><<< 41684 1727204453.80472: stderr chunk (state=3): >>><<< 41684 1727204453.80570: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "set", "ethtest0", "up"], "start": "2024-09-24 15:00:53.785888", "end": "2024-09-24 15:00:53.791677", "delta": "0:00:00.005789", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link set ethtest0 up", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration 
debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.148 closed. 41684 1727204453.80579: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip link set ethtest0 up', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204453.534466-42905-13458628899847/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 41684 1727204453.80584: _low_level_execute_command(): starting 41684 1727204453.80586: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204453.534466-42905-13458628899847/ > /dev/null 2>&1 && sleep 0' 41684 1727204453.81314: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 41684 1727204453.81331: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41684 1727204453.81347: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41684 
1727204453.81379: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41684 1727204453.81428: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 41684 1727204453.81442: stderr chunk (state=3): >>>debug2: match not found <<< 41684 1727204453.81456: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204453.81483: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 41684 1727204453.81493: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 41684 1727204453.81502: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 41684 1727204453.81515: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41684 1727204453.81526: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41684 1727204453.81539: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41684 1727204453.81548: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 41684 1727204453.81557: stderr chunk (state=3): >>>debug2: match found <<< 41684 1727204453.81575: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204453.81657: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 41684 1727204453.81684: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41684 1727204453.81703: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41684 1727204453.81791: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41684 1727204453.83568: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41684 1727204453.83653: stderr 
chunk (state=3): >>><<< 41684 1727204453.83667: stdout chunk (state=3): >>><<< 41684 1727204453.83691: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41684 1727204453.83694: handler run complete 41684 1727204453.83717: Evaluated conditional (False): False 41684 1727204453.83726: attempt loop complete, returning result 41684 1727204453.83745: variable 'item' from source: unknown 41684 1727204453.83833: variable 'item' from source: unknown ok: [managed-node1] => (item=ip link set ethtest0 up) => { "ansible_loop_var": "item", "changed": false, "cmd": [ "ip", "link", "set", "ethtest0", "up" ], "delta": "0:00:00.005789", "end": "2024-09-24 15:00:53.791677", "item": "ip link set ethtest0 up", "rc": 0, "start": "2024-09-24 15:00:53.785888" } 41684 1727204453.83959: dumping result to json 41684 1727204453.83966: done dumping result, returning 41684 
1727204453.83969: done running TaskExecutor() for managed-node1/TASK: Create veth interface ethtest0 [0affcd87-79f5-3839-086d-00000000016e] 41684 1727204453.83970: sending task result for task 0affcd87-79f5-3839-086d-00000000016e 41684 1727204453.84017: done sending task result for task 0affcd87-79f5-3839-086d-00000000016e 41684 1727204453.84020: WORKER PROCESS EXITING 41684 1727204453.84082: no more pending results, returning what we have 41684 1727204453.84086: results queue empty 41684 1727204453.84086: checking for any_errors_fatal 41684 1727204453.84092: done checking for any_errors_fatal 41684 1727204453.84093: checking for max_fail_percentage 41684 1727204453.84094: done checking for max_fail_percentage 41684 1727204453.84095: checking to see if all hosts have failed and the running result is not ok 41684 1727204453.84096: done checking to see if all hosts have failed 41684 1727204453.84096: getting the remaining hosts for this loop 41684 1727204453.84098: done getting the remaining hosts for this loop 41684 1727204453.84101: getting the next task for host managed-node1 41684 1727204453.84107: done getting next task for host managed-node1 41684 1727204453.84110: ^ task is: TASK: Set up veth as managed by NetworkManager 41684 1727204453.84113: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 41684 1727204453.84115: getting variables 41684 1727204453.84117: in VariableManager get_vars() 41684 1727204453.84159: Calling all_inventory to load vars for managed-node1 41684 1727204453.84166: Calling groups_inventory to load vars for managed-node1 41684 1727204453.84168: Calling all_plugins_inventory to load vars for managed-node1 41684 1727204453.84178: Calling all_plugins_play to load vars for managed-node1 41684 1727204453.84180: Calling groups_plugins_inventory to load vars for managed-node1 41684 1727204453.84183: Calling groups_plugins_play to load vars for managed-node1 41684 1727204453.84369: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41684 1727204453.84628: done with get_vars() 41684 1727204453.84640: done getting variables 41684 1727204453.84707: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set up veth as managed by NetworkManager] ******************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:35 Tuesday 24 September 2024 15:00:53 -0400 (0:00:01.217) 0:00:10.248 ***** 41684 1727204453.84734: entering _queue_task() for managed-node1/command 41684 1727204453.85010: worker is 1 (out of 1 available) 41684 1727204453.85023: exiting _queue_task() for managed-node1/command 41684 1727204453.85036: done queuing things up, now waiting for results queue to drain 41684 1727204453.85037: waiting for pending results... 
41684 1727204453.85308: running TaskExecutor() for managed-node1/TASK: Set up veth as managed by NetworkManager 41684 1727204453.85419: in run() - task 0affcd87-79f5-3839-086d-00000000016f 41684 1727204453.85438: variable 'ansible_search_path' from source: unknown 41684 1727204453.85444: variable 'ansible_search_path' from source: unknown 41684 1727204453.85490: calling self._execute() 41684 1727204453.85574: variable 'ansible_host' from source: host vars for 'managed-node1' 41684 1727204453.85585: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41684 1727204453.85601: variable 'omit' from source: magic vars 41684 1727204453.86032: variable 'ansible_distribution_major_version' from source: facts 41684 1727204453.86052: Evaluated conditional (ansible_distribution_major_version != '6'): True 41684 1727204453.86282: variable 'type' from source: set_fact 41684 1727204453.86292: variable 'state' from source: include params 41684 1727204453.86302: Evaluated conditional (type == 'veth' and state == 'present'): True 41684 1727204453.86312: variable 'omit' from source: magic vars 41684 1727204453.86358: variable 'omit' from source: magic vars 41684 1727204453.86473: variable 'interface' from source: set_fact 41684 1727204453.86496: variable 'omit' from source: magic vars 41684 1727204453.86544: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 41684 1727204453.86596: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 41684 1727204453.86637: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 41684 1727204453.86661: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41684 1727204453.86686: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 
41684 1727204453.86720: variable 'inventory_hostname' from source: host vars for 'managed-node1' 41684 1727204453.86729: variable 'ansible_host' from source: host vars for 'managed-node1' 41684 1727204453.86737: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41684 1727204453.86872: Set connection var ansible_connection to ssh 41684 1727204453.86885: Set connection var ansible_pipelining to False 41684 1727204453.86899: Set connection var ansible_module_compression to ZIP_DEFLATED 41684 1727204453.86909: Set connection var ansible_timeout to 10 41684 1727204453.86920: Set connection var ansible_shell_executable to /bin/sh 41684 1727204453.86927: Set connection var ansible_shell_type to sh 41684 1727204453.86956: variable 'ansible_shell_executable' from source: unknown 41684 1727204453.86970: variable 'ansible_connection' from source: unknown 41684 1727204453.86978: variable 'ansible_module_compression' from source: unknown 41684 1727204453.86985: variable 'ansible_shell_type' from source: unknown 41684 1727204453.86992: variable 'ansible_shell_executable' from source: unknown 41684 1727204453.87000: variable 'ansible_host' from source: host vars for 'managed-node1' 41684 1727204453.87011: variable 'ansible_pipelining' from source: unknown 41684 1727204453.87019: variable 'ansible_timeout' from source: unknown 41684 1727204453.87026: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41684 1727204453.87181: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 41684 1727204453.87198: variable 'omit' from source: magic vars 41684 1727204453.87208: starting attempt loop 41684 1727204453.87214: running the handler 41684 1727204453.87237: _low_level_execute_command(): 
starting 41684 1727204453.87250: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 41684 1727204453.88028: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 41684 1727204453.88047: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41684 1727204453.88069: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41684 1727204453.88088: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41684 1727204453.88128: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 41684 1727204453.88143: stderr chunk (state=3): >>>debug2: match not found <<< 41684 1727204453.88156: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204453.88179: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 41684 1727204453.88190: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 41684 1727204453.88202: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 41684 1727204453.88216: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41684 1727204453.88230: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41684 1727204453.88251: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41684 1727204453.88270: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 41684 1727204453.88283: stderr chunk (state=3): >>>debug2: match found <<< 41684 1727204453.88298: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204453.88382: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 41684 
1727204453.88409: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41684 1727204453.88427: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41684 1727204453.88522: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41684 1727204453.90061: stdout chunk (state=3): >>>/root <<< 41684 1727204453.90168: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41684 1727204453.90269: stderr chunk (state=3): >>><<< 41684 1727204453.90286: stdout chunk (state=3): >>><<< 41684 1727204453.90372: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41684 1727204453.90376: _low_level_execute_command(): starting 41684 1727204453.90387: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo 
/root/.ansible/tmp/ansible-tmp-1727204453.903258-42964-53080636136056 `" && echo ansible-tmp-1727204453.903258-42964-53080636136056="` echo /root/.ansible/tmp/ansible-tmp-1727204453.903258-42964-53080636136056 `" ) && sleep 0' 41684 1727204453.91075: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 41684 1727204453.91093: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41684 1727204453.91112: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41684 1727204453.91132: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41684 1727204453.91183: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 41684 1727204453.91196: stderr chunk (state=3): >>>debug2: match not found <<< 41684 1727204453.91211: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204453.91235: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 41684 1727204453.91248: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 41684 1727204453.91260: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 41684 1727204453.91278: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41684 1727204453.91293: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41684 1727204453.91308: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41684 1727204453.91319: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 41684 1727204453.91334: stderr chunk (state=3): >>>debug2: match found <<< 41684 1727204453.91347: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204453.91428: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 41684 1727204453.91457: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41684 1727204453.91479: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41684 1727204453.91577: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41684 1727204453.93420: stdout chunk (state=3): >>>ansible-tmp-1727204453.903258-42964-53080636136056=/root/.ansible/tmp/ansible-tmp-1727204453.903258-42964-53080636136056 <<< 41684 1727204453.93631: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41684 1727204453.93635: stdout chunk (state=3): >>><<< 41684 1727204453.93638: stderr chunk (state=3): >>><<< 41684 1727204453.93974: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204453.903258-42964-53080636136056=/root/.ansible/tmp/ansible-tmp-1727204453.903258-42964-53080636136056 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 
setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41684 1727204453.93978: variable 'ansible_module_compression' from source: unknown 41684 1727204453.93981: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-41684fyviudxd/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 41684 1727204453.93983: variable 'ansible_facts' from source: unknown 41684 1727204453.93985: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204453.903258-42964-53080636136056/AnsiballZ_command.py 41684 1727204453.94054: Sending initial data 41684 1727204453.94057: Sent initial data (154 bytes) 41684 1727204453.95112: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 41684 1727204453.95127: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41684 1727204453.95141: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41684 1727204453.95159: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41684 1727204453.95212: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 41684 1727204453.95223: stderr chunk (state=3): >>>debug2: match not found <<< 41684 1727204453.95237: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204453.95255: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 41684 1727204453.95269: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 41684 1727204453.95283: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 41684 1727204453.95299: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41684 1727204453.95311: stderr chunk (state=3): >>>debug1: Reading 
configuration data /etc/ssh/ssh_config <<< 41684 1727204453.95325: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41684 1727204453.95337: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 41684 1727204453.95347: stderr chunk (state=3): >>>debug2: match found <<< 41684 1727204453.95359: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204453.95445: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 41684 1727204453.95472: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41684 1727204453.95490: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41684 1727204453.95590: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41684 1727204453.97303: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 41684 1727204453.97355: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 41684 1727204453.97405: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-41684fyviudxd/tmpd91ko5u6 /root/.ansible/tmp/ansible-tmp-1727204453.903258-42964-53080636136056/AnsiballZ_command.py <<< 41684 1727204453.97456: stderr chunk (state=3): >>>debug1: 
Couldn't stat remote file: No such file or directory <<< 41684 1727204453.98695: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41684 1727204453.98904: stderr chunk (state=3): >>><<< 41684 1727204453.98908: stdout chunk (state=3): >>><<< 41684 1727204453.98910: done transferring module to remote 41684 1727204453.98912: _low_level_execute_command(): starting 41684 1727204453.98915: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204453.903258-42964-53080636136056/ /root/.ansible/tmp/ansible-tmp-1727204453.903258-42964-53080636136056/AnsiballZ_command.py && sleep 0' 41684 1727204453.99553: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 41684 1727204453.99579: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41684 1727204453.99595: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41684 1727204453.99612: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41684 1727204453.99657: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 41684 1727204453.99677: stderr chunk (state=3): >>>debug2: match not found <<< 41684 1727204453.99695: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204453.99712: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 41684 1727204453.99723: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 41684 1727204453.99733: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 41684 1727204453.99744: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41684 1727204453.99757: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41684 1727204453.99781: stderr chunk 
(state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41684 1727204453.99799: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 41684 1727204453.99811: stderr chunk (state=3): >>>debug2: match found <<< 41684 1727204453.99824: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204453.99907: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 41684 1727204453.99927: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41684 1727204453.99940: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41684 1727204454.00028: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41684 1727204454.01795: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41684 1727204454.01845: stderr chunk (state=3): >>><<< 41684 1727204454.01849: stdout chunk (state=3): >>><<< 41684 1727204454.01952: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41684 1727204454.01957: _low_level_execute_command(): starting 41684 1727204454.01959: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204453.903258-42964-53080636136056/AnsiballZ_command.py && sleep 0' 41684 1727204454.02432: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 41684 1727204454.02435: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41684 1727204454.02476: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204454.02480: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 41684 1727204454.02484: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204454.02535: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 41684 1727204454.02539: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41684 1727204454.02544: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 
4 <<< 41684 1727204454.02604: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41684 1727204454.17590: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["nmcli", "d", "set", "ethtest0", "managed", "true"], "start": "2024-09-24 15:00:54.155369", "end": "2024-09-24 15:00:54.174907", "delta": "0:00:00.019538", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli d set ethtest0 managed true", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 41684 1727204454.18790: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.148 closed. <<< 41684 1727204454.18849: stderr chunk (state=3): >>><<< 41684 1727204454.18853: stdout chunk (state=3): >>><<< 41684 1727204454.18873: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["nmcli", "d", "set", "ethtest0", "managed", "true"], "start": "2024-09-24 15:00:54.155369", "end": "2024-09-24 15:00:54.174907", "delta": "0:00:00.019538", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli d set ethtest0 managed true", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass 
debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.148 closed. 41684 1727204454.18902: done with _execute_module (ansible.legacy.command, {'_raw_params': 'nmcli d set ethtest0 managed true', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204453.903258-42964-53080636136056/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 41684 1727204454.18909: _low_level_execute_command(): starting 41684 1727204454.18914: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204453.903258-42964-53080636136056/ > /dev/null 2>&1 && sleep 0' 41684 1727204454.19411: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 41684 
1727204454.19414: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41684 1727204454.19450: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204454.19453: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 41684 1727204454.19455: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found <<< 41684 1727204454.19457: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204454.19506: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 41684 1727204454.19518: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 41684 1727204454.19582: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41684 1727204454.21345: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41684 1727204454.21575: stderr chunk (state=3): >>><<< 41684 1727204454.21579: stdout chunk (state=3): >>><<< 41684 1727204454.21582: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41684 1727204454.21585: handler run complete 41684 1727204454.21587: Evaluated conditional (False): False 41684 1727204454.21589: attempt loop complete, returning result 41684 1727204454.21591: _execute() done 41684 1727204454.21593: dumping result to json 41684 1727204454.21595: done dumping result, returning 41684 1727204454.21597: done running TaskExecutor() for managed-node1/TASK: Set up veth as managed by NetworkManager [0affcd87-79f5-3839-086d-00000000016f] 41684 1727204454.21599: sending task result for task 0affcd87-79f5-3839-086d-00000000016f 41684 1727204454.21678: done sending task result for task 0affcd87-79f5-3839-086d-00000000016f 41684 1727204454.21682: WORKER PROCESS EXITING ok: [managed-node1] => { "changed": false, "cmd": [ "nmcli", "d", "set", "ethtest0", "managed", "true" ], "delta": "0:00:00.019538", "end": "2024-09-24 15:00:54.174907", "rc": 0, "start": "2024-09-24 15:00:54.155369" } 41684 1727204454.21750: no more pending results, returning what we have 41684 1727204454.21754: results queue empty 41684 1727204454.21755: checking for any_errors_fatal 41684 1727204454.21769: done checking for any_errors_fatal 41684 1727204454.21770: checking for 
max_fail_percentage 41684 1727204454.21772: done checking for max_fail_percentage 41684 1727204454.21772: checking to see if all hosts have failed and the running result is not ok 41684 1727204454.21773: done checking to see if all hosts have failed 41684 1727204454.21774: getting the remaining hosts for this loop 41684 1727204454.21776: done getting the remaining hosts for this loop 41684 1727204454.21780: getting the next task for host managed-node1 41684 1727204454.21786: done getting next task for host managed-node1 41684 1727204454.21789: ^ task is: TASK: Delete veth interface {{ interface }} 41684 1727204454.21792: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 41684 1727204454.21796: getting variables 41684 1727204454.21798: in VariableManager get_vars() 41684 1727204454.21844: Calling all_inventory to load vars for managed-node1 41684 1727204454.21846: Calling groups_inventory to load vars for managed-node1 41684 1727204454.21849: Calling all_plugins_inventory to load vars for managed-node1 41684 1727204454.21861: Calling all_plugins_play to load vars for managed-node1 41684 1727204454.21865: Calling groups_plugins_inventory to load vars for managed-node1 41684 1727204454.21869: Calling groups_plugins_play to load vars for managed-node1 41684 1727204454.22279: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41684 1727204454.22583: done with get_vars() 41684 1727204454.22597: done getting variables 41684 1727204454.22669: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 41684 1727204454.22806: variable 'interface' from source: set_fact TASK [Delete veth interface ethtest0] ****************************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:43 Tuesday 24 September 2024 15:00:54 -0400 (0:00:00.381) 0:00:10.629 ***** 41684 1727204454.22846: entering _queue_task() for managed-node1/command 41684 1727204454.23141: worker is 1 (out of 1 available) 41684 1727204454.23154: exiting _queue_task() for managed-node1/command 41684 1727204454.23174: done queuing things up, now waiting for results queue to drain 41684 1727204454.23176: waiting for pending results... 
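The `AnsiballZ_command.py` execution earlier in this trace returned its result as a single JSON object on stdout (the `{"changed": true, ... "rc": 0, ...}` payload). A minimal sketch of picking such a payload apart, using the exact values logged above; this is illustrative parsing, not Ansible's actual result-handling code:

```python
import json

# Module stdout captured in the trace above (abbreviated to the keys used below).
raw = '''{"changed": true, "stdout": "", "stderr": "", "rc": 0,
          "cmd": ["nmcli", "d", "set", "ethtest0", "managed", "true"],
          "start": "2024-09-24 15:00:54.155369",
          "end": "2024-09-24 15:00:54.174907",
          "delta": "0:00:00.019538"}'''

result = json.loads(raw)

# rc == 0 with empty stderr means the nmcli call on the remote succeeded.
assert result["rc"] == 0 and result["stderr"] == ""
print(result["cmd"], result["delta"])
```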
41684 1727204454.23452: running TaskExecutor() for managed-node1/TASK: Delete veth interface ethtest0 41684 1727204454.23573: in run() - task 0affcd87-79f5-3839-086d-000000000170 41684 1727204454.23596: variable 'ansible_search_path' from source: unknown 41684 1727204454.23613: variable 'ansible_search_path' from source: unknown 41684 1727204454.23659: calling self._execute() 41684 1727204454.23766: variable 'ansible_host' from source: host vars for 'managed-node1' 41684 1727204454.23780: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41684 1727204454.23796: variable 'omit' from source: magic vars 41684 1727204454.24144: variable 'ansible_distribution_major_version' from source: facts 41684 1727204454.24159: Evaluated conditional (ansible_distribution_major_version != '6'): True 41684 1727204454.24293: variable 'type' from source: set_fact 41684 1727204454.24296: variable 'state' from source: include params 41684 1727204454.24300: variable 'interface' from source: set_fact 41684 1727204454.24304: variable 'current_interfaces' from source: set_fact 41684 1727204454.24311: Evaluated conditional (type == 'veth' and state == 'absent' and interface in current_interfaces): False 41684 1727204454.24314: when evaluation is False, skipping this task 41684 1727204454.24316: _execute() done 41684 1727204454.24318: dumping result to json 41684 1727204454.24320: done dumping result, returning 41684 1727204454.24327: done running TaskExecutor() for managed-node1/TASK: Delete veth interface ethtest0 [0affcd87-79f5-3839-086d-000000000170] 41684 1727204454.24333: sending task result for task 0affcd87-79f5-3839-086d-000000000170 41684 1727204454.24422: done sending task result for task 0affcd87-79f5-3839-086d-000000000170 41684 1727204454.24424: WORKER PROCESS EXITING skipping: [managed-node1] => { "changed": false, "false_condition": "type == 'veth' and state == 'absent' and interface in current_interfaces", "skip_reason": "Conditional result was 
False" } 41684 1727204454.24471: no more pending results, returning what we have 41684 1727204454.24477: results queue empty 41684 1727204454.24478: checking for any_errors_fatal 41684 1727204454.24486: done checking for any_errors_fatal 41684 1727204454.24486: checking for max_fail_percentage 41684 1727204454.24488: done checking for max_fail_percentage 41684 1727204454.24488: checking to see if all hosts have failed and the running result is not ok 41684 1727204454.24489: done checking to see if all hosts have failed 41684 1727204454.24490: getting the remaining hosts for this loop 41684 1727204454.24491: done getting the remaining hosts for this loop 41684 1727204454.24495: getting the next task for host managed-node1 41684 1727204454.24501: done getting next task for host managed-node1 41684 1727204454.24504: ^ task is: TASK: Create dummy interface {{ interface }} 41684 1727204454.24507: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 41684 1727204454.24511: getting variables 41684 1727204454.24512: in VariableManager get_vars() 41684 1727204454.24549: Calling all_inventory to load vars for managed-node1 41684 1727204454.24552: Calling groups_inventory to load vars for managed-node1 41684 1727204454.24554: Calling all_plugins_inventory to load vars for managed-node1 41684 1727204454.24568: Calling all_plugins_play to load vars for managed-node1 41684 1727204454.24570: Calling groups_plugins_inventory to load vars for managed-node1 41684 1727204454.24573: Calling groups_plugins_play to load vars for managed-node1 41684 1727204454.24702: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41684 1727204454.24879: done with get_vars() 41684 1727204454.24887: done getting variables 41684 1727204454.24931: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 41684 1727204454.25013: variable 'interface' from source: set_fact TASK [Create dummy interface ethtest0] ***************************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:49 Tuesday 24 September 2024 15:00:54 -0400 (0:00:00.021) 0:00:10.651 ***** 41684 1727204454.25036: entering _queue_task() for managed-node1/command 41684 1727204454.25230: worker is 1 (out of 1 available) 41684 1727204454.25244: exiting _queue_task() for managed-node1/command 41684 1727204454.25257: done queuing things up, now waiting for results queue to drain 41684 1727204454.25259: waiting for pending results... 
41684 1727204454.25417: running TaskExecutor() for managed-node1/TASK: Create dummy interface ethtest0 41684 1727204454.25488: in run() - task 0affcd87-79f5-3839-086d-000000000171 41684 1727204454.25499: variable 'ansible_search_path' from source: unknown 41684 1727204454.25502: variable 'ansible_search_path' from source: unknown 41684 1727204454.25531: calling self._execute() 41684 1727204454.25595: variable 'ansible_host' from source: host vars for 'managed-node1' 41684 1727204454.25600: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41684 1727204454.25608: variable 'omit' from source: magic vars 41684 1727204454.25860: variable 'ansible_distribution_major_version' from source: facts 41684 1727204454.25873: Evaluated conditional (ansible_distribution_major_version != '6'): True 41684 1727204454.26006: variable 'type' from source: set_fact 41684 1727204454.26014: variable 'state' from source: include params 41684 1727204454.26022: variable 'interface' from source: set_fact 41684 1727204454.26025: variable 'current_interfaces' from source: set_fact 41684 1727204454.26033: Evaluated conditional (type == 'dummy' and state == 'present' and interface not in current_interfaces): False 41684 1727204454.26036: when evaluation is False, skipping this task 41684 1727204454.26038: _execute() done 41684 1727204454.26042: dumping result to json 41684 1727204454.26044: done dumping result, returning 41684 1727204454.26050: done running TaskExecutor() for managed-node1/TASK: Create dummy interface ethtest0 [0affcd87-79f5-3839-086d-000000000171] 41684 1727204454.26060: sending task result for task 0affcd87-79f5-3839-086d-000000000171 41684 1727204454.26138: done sending task result for task 0affcd87-79f5-3839-086d-000000000171 41684 1727204454.26140: WORKER PROCESS EXITING skipping: [managed-node1] => { "changed": false, "false_condition": "type == 'dummy' and state == 'present' and interface not in current_interfaces", "skip_reason": "Conditional 
result was False" } 41684 1727204454.26199: no more pending results, returning what we have 41684 1727204454.26203: results queue empty 41684 1727204454.26204: checking for any_errors_fatal 41684 1727204454.26212: done checking for any_errors_fatal 41684 1727204454.26212: checking for max_fail_percentage 41684 1727204454.26214: done checking for max_fail_percentage 41684 1727204454.26214: checking to see if all hosts have failed and the running result is not ok 41684 1727204454.26215: done checking to see if all hosts have failed 41684 1727204454.26216: getting the remaining hosts for this loop 41684 1727204454.26217: done getting the remaining hosts for this loop 41684 1727204454.26220: getting the next task for host managed-node1 41684 1727204454.26226: done getting next task for host managed-node1 41684 1727204454.26228: ^ task is: TASK: Delete dummy interface {{ interface }} 41684 1727204454.26231: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 41684 1727204454.26234: getting variables 41684 1727204454.26235: in VariableManager get_vars() 41684 1727204454.26270: Calling all_inventory to load vars for managed-node1 41684 1727204454.26273: Calling groups_inventory to load vars for managed-node1 41684 1727204454.26275: Calling all_plugins_inventory to load vars for managed-node1 41684 1727204454.26284: Calling all_plugins_play to load vars for managed-node1 41684 1727204454.26286: Calling groups_plugins_inventory to load vars for managed-node1 41684 1727204454.26288: Calling groups_plugins_play to load vars for managed-node1 41684 1727204454.26409: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41684 1727204454.26542: done with get_vars() 41684 1727204454.26551: done getting variables 41684 1727204454.26597: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 41684 1727204454.26681: variable 'interface' from source: set_fact TASK [Delete dummy interface ethtest0] ***************************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:54 Tuesday 24 September 2024 15:00:54 -0400 (0:00:00.016) 0:00:10.668 ***** 41684 1727204454.26701: entering _queue_task() for managed-node1/command 41684 1727204454.26892: worker is 1 (out of 1 available) 41684 1727204454.26906: exiting _queue_task() for managed-node1/command 41684 1727204454.26918: done queuing things up, now waiting for results queue to drain 41684 1727204454.26919: waiting for pending results... 
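Each skipped task above follows the same pattern: a `when:` conditional is evaluated against host variables, and a False result short-circuits the task. A sketch mirroring the conditional logged for the "Delete dummy interface" task; the variable values are inferred from this run (the veth branch executed, so `type` is assumed to be `veth` and `state` `present`) and the `current_interfaces` list is a hypothetical placeholder, not read from the actual playbook:

```python
# Hypothetical host variables inferred from the trace above.
host_vars = {
    "type": "veth",          # assumed: the veth branch ran earlier
    "state": "present",      # assumed: the absent branches were skipped
    "interface": "ethtest0",
    "current_interfaces": ["lo", "eth0", "ethtest0"],  # placeholder list
}

def evaluate(v):
    # Mirrors: type == 'dummy' and state == 'absent' and interface in current_interfaces
    return (
        v["type"] == "dummy"
        and v["state"] == "absent"
        and v["interface"] in v["current_interfaces"]
    )

# Matches the log line "Evaluated conditional (...): False" -> task is skipped.
print(evaluate(host_vars))
```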
41684 1727204454.27078: running TaskExecutor() for managed-node1/TASK: Delete dummy interface ethtest0 41684 1727204454.27144: in run() - task 0affcd87-79f5-3839-086d-000000000172 41684 1727204454.27154: variable 'ansible_search_path' from source: unknown 41684 1727204454.27158: variable 'ansible_search_path' from source: unknown 41684 1727204454.27190: calling self._execute() 41684 1727204454.27257: variable 'ansible_host' from source: host vars for 'managed-node1' 41684 1727204454.27260: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41684 1727204454.27272: variable 'omit' from source: magic vars 41684 1727204454.27532: variable 'ansible_distribution_major_version' from source: facts 41684 1727204454.27542: Evaluated conditional (ansible_distribution_major_version != '6'): True 41684 1727204454.27686: variable 'type' from source: set_fact 41684 1727204454.27690: variable 'state' from source: include params 41684 1727204454.27694: variable 'interface' from source: set_fact 41684 1727204454.27697: variable 'current_interfaces' from source: set_fact 41684 1727204454.27704: Evaluated conditional (type == 'dummy' and state == 'absent' and interface in current_interfaces): False 41684 1727204454.27706: when evaluation is False, skipping this task 41684 1727204454.27709: _execute() done 41684 1727204454.27711: dumping result to json 41684 1727204454.27714: done dumping result, returning 41684 1727204454.27720: done running TaskExecutor() for managed-node1/TASK: Delete dummy interface ethtest0 [0affcd87-79f5-3839-086d-000000000172] 41684 1727204454.27726: sending task result for task 0affcd87-79f5-3839-086d-000000000172 41684 1727204454.27808: done sending task result for task 0affcd87-79f5-3839-086d-000000000172 41684 1727204454.27811: WORKER PROCESS EXITING skipping: [managed-node1] => { "changed": false, "false_condition": "type == 'dummy' and state == 'absent' and interface in current_interfaces", "skip_reason": "Conditional result was 
False" } 41684 1727204454.27857: no more pending results, returning what we have 41684 1727204454.27861: results queue empty 41684 1727204454.27866: checking for any_errors_fatal 41684 1727204454.27873: done checking for any_errors_fatal 41684 1727204454.27874: checking for max_fail_percentage 41684 1727204454.27875: done checking for max_fail_percentage 41684 1727204454.27876: checking to see if all hosts have failed and the running result is not ok 41684 1727204454.27876: done checking to see if all hosts have failed 41684 1727204454.27877: getting the remaining hosts for this loop 41684 1727204454.27879: done getting the remaining hosts for this loop 41684 1727204454.27883: getting the next task for host managed-node1 41684 1727204454.27889: done getting next task for host managed-node1 41684 1727204454.27891: ^ task is: TASK: Create tap interface {{ interface }} 41684 1727204454.27894: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 41684 1727204454.27898: getting variables 41684 1727204454.27899: in VariableManager get_vars() 41684 1727204454.27941: Calling all_inventory to load vars for managed-node1 41684 1727204454.27943: Calling groups_inventory to load vars for managed-node1 41684 1727204454.27945: Calling all_plugins_inventory to load vars for managed-node1 41684 1727204454.27955: Calling all_plugins_play to load vars for managed-node1 41684 1727204454.27957: Calling groups_plugins_inventory to load vars for managed-node1 41684 1727204454.27959: Calling groups_plugins_play to load vars for managed-node1 41684 1727204454.28128: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41684 1727204454.28256: done with get_vars() 41684 1727204454.28267: done getting variables 41684 1727204454.28309: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 41684 1727204454.28390: variable 'interface' from source: set_fact TASK [Create tap interface ethtest0] ******************************************* task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:60 Tuesday 24 September 2024 15:00:54 -0400 (0:00:00.017) 0:00:10.685 ***** 41684 1727204454.28412: entering _queue_task() for managed-node1/command 41684 1727204454.28608: worker is 1 (out of 1 available) 41684 1727204454.28621: exiting _queue_task() for managed-node1/command 41684 1727204454.28633: done queuing things up, now waiting for results queue to drain 41684 1727204454.28634: waiting for pending results... 
41684 1727204454.28788: running TaskExecutor() for managed-node1/TASK: Create tap interface ethtest0
41684 1727204454.28851: in run() - task 0affcd87-79f5-3839-086d-000000000173
41684 1727204454.28866: variable 'ansible_search_path' from source: unknown
41684 1727204454.28870: variable 'ansible_search_path' from source: unknown
41684 1727204454.28896: calling self._execute()
41684 1727204454.28959: variable 'ansible_host' from source: host vars for 'managed-node1'
41684 1727204454.28968: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1'
41684 1727204454.28973: variable 'omit' from source: magic vars
41684 1727204454.29239: variable 'ansible_distribution_major_version' from source: facts
41684 1727204454.29251: Evaluated conditional (ansible_distribution_major_version != '6'): True
41684 1727204454.29390: variable 'type' from source: set_fact
41684 1727204454.29394: variable 'state' from source: include params
41684 1727204454.29397: variable 'interface' from source: set_fact
41684 1727204454.29402: variable 'current_interfaces' from source: set_fact
41684 1727204454.29408: Evaluated conditional (type == 'tap' and state == 'present' and interface not in current_interfaces): False
41684 1727204454.29414: when evaluation is False, skipping this task
41684 1727204454.29417: _execute() done
41684 1727204454.29420: dumping result to json
41684 1727204454.29422: done dumping result, returning
41684 1727204454.29428: done running TaskExecutor() for managed-node1/TASK: Create tap interface ethtest0 [0affcd87-79f5-3839-086d-000000000173]
41684 1727204454.29434: sending task result for task 0affcd87-79f5-3839-086d-000000000173
41684 1727204454.29515: done sending task result for task 0affcd87-79f5-3839-086d-000000000173
41684 1727204454.29517: WORKER PROCESS EXITING
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "type == 'tap' and state == 'present' and interface not in current_interfaces",
    "skip_reason": "Conditional result was False"
}
41684 1727204454.29572: no more pending results, returning what we have
41684 1727204454.29575: results queue empty
41684 1727204454.29576: checking for any_errors_fatal
41684 1727204454.29584: done checking for any_errors_fatal
41684 1727204454.29584: checking for max_fail_percentage
41684 1727204454.29586: done checking for max_fail_percentage
41684 1727204454.29586: checking to see if all hosts have failed and the running result is not ok
41684 1727204454.29587: done checking to see if all hosts have failed
41684 1727204454.29588: getting the remaining hosts for this loop
41684 1727204454.29589: done getting the remaining hosts for this loop
41684 1727204454.29593: getting the next task for host managed-node1
41684 1727204454.29598: done getting next task for host managed-node1
41684 1727204454.29601: ^ task is: TASK: Delete tap interface {{ interface }}
41684 1727204454.29603: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
41684 1727204454.29606: getting variables
41684 1727204454.29608: in VariableManager get_vars()
41684 1727204454.29645: Calling all_inventory to load vars for managed-node1
41684 1727204454.29648: Calling groups_inventory to load vars for managed-node1
41684 1727204454.29650: Calling all_plugins_inventory to load vars for managed-node1
41684 1727204454.29659: Calling all_plugins_play to load vars for managed-node1
41684 1727204454.29661: Calling groups_plugins_inventory to load vars for managed-node1
41684 1727204454.29666: Calling groups_plugins_play to load vars for managed-node1
41684 1727204454.29785: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
41684 1727204454.29913: done with get_vars()
41684 1727204454.29920: done getting variables
41684 1727204454.29962: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)
41684 1727204454.30041: variable 'interface' from source: set_fact

TASK [Delete tap interface ethtest0] *******************************************
task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:65
Tuesday 24 September 2024 15:00:54 -0400 (0:00:00.016) 0:00:10.702 *****
41684 1727204454.30062: entering _queue_task() for managed-node1/command
41684 1727204454.30253: worker is 1 (out of 1 available)
41684 1727204454.30268: exiting _queue_task() for managed-node1/command
41684 1727204454.30281: done queuing things up, now waiting for results queue to drain
41684 1727204454.30282: waiting for pending results...
41684 1727204454.30435: running TaskExecutor() for managed-node1/TASK: Delete tap interface ethtest0
41684 1727204454.30496: in run() - task 0affcd87-79f5-3839-086d-000000000174
41684 1727204454.30508: variable 'ansible_search_path' from source: unknown
41684 1727204454.30511: variable 'ansible_search_path' from source: unknown
41684 1727204454.30540: calling self._execute()
41684 1727204454.30606: variable 'ansible_host' from source: host vars for 'managed-node1'
41684 1727204454.30611: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1'
41684 1727204454.30619: variable 'omit' from source: magic vars
41684 1727204454.30873: variable 'ansible_distribution_major_version' from source: facts
41684 1727204454.30885: Evaluated conditional (ansible_distribution_major_version != '6'): True
41684 1727204454.31019: variable 'type' from source: set_fact
41684 1727204454.31022: variable 'state' from source: include params
41684 1727204454.31025: variable 'interface' from source: set_fact
41684 1727204454.31030: variable 'current_interfaces' from source: set_fact
41684 1727204454.31036: Evaluated conditional (type == 'tap' and state == 'absent' and interface in current_interfaces): False
41684 1727204454.31039: when evaluation is False, skipping this task
41684 1727204454.31041: _execute() done
41684 1727204454.31044: dumping result to json
41684 1727204454.31046: done dumping result, returning
41684 1727204454.31051: done running TaskExecutor() for managed-node1/TASK: Delete tap interface ethtest0 [0affcd87-79f5-3839-086d-000000000174]
41684 1727204454.31068: sending task result for task 0affcd87-79f5-3839-086d-000000000174
41684 1727204454.31146: done sending task result for task 0affcd87-79f5-3839-086d-000000000174
41684 1727204454.31149: WORKER PROCESS EXITING
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "type == 'tap' and state == 'absent' and interface in current_interfaces",
    "skip_reason": "Conditional result was False"
}
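Both tap-interface tasks above are skipped because their `when:` guards evaluate to False for this run. The following is a hypothetical shell analogue of the create-side guard; the variable values are assumptions that reproduce a skipping run, and the `ip tuntap` command is an assumption about what the skipped `command` task would execute, since this log never runs it.

```shell
# Sketch of the `when:` guard on "Create tap interface {{ interface }}".
# Values below are invented to mirror a run where the interface already
# exists, so the third clause fails and the task is skipped.
type='tap'; state='present'; interface='ethtest0'
current_interfaces='lo eth0 ethtest0'   # assumed interface list
if [ "$type" = tap ] && [ "$state" = present ] \
   && ! printf '%s\n' $current_interfaces | grep -qx "$interface"; then
    echo "would run: ip tuntap add dev $interface mode tap"
else
    echo "skipping: conditional result was False"
fi
```

The delete-side guard is symmetric: `state == 'absent' and interface in current_interfaces`, so exactly one of the pair can ever fire for a given state.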
41684 1727204454.31219: no more pending results, returning what we have
41684 1727204454.31223: results queue empty
41684 1727204454.31224: checking for any_errors_fatal
41684 1727204454.31229: done checking for any_errors_fatal
41684 1727204454.31230: checking for max_fail_percentage
41684 1727204454.31231: done checking for max_fail_percentage
41684 1727204454.31232: checking to see if all hosts have failed and the running result is not ok
41684 1727204454.31233: done checking to see if all hosts have failed
41684 1727204454.31233: getting the remaining hosts for this loop
41684 1727204454.31234: done getting the remaining hosts for this loop
41684 1727204454.31238: getting the next task for host managed-node1
41684 1727204454.31244: done getting next task for host managed-node1
41684 1727204454.31247: ^ task is: TASK: Assert device is present
41684 1727204454.31250: ^ state is: HOST STATE: block=2, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
41684 1727204454.31253: getting variables
41684 1727204454.31254: in VariableManager get_vars()
41684 1727204454.31294: Calling all_inventory to load vars for managed-node1
41684 1727204454.31296: Calling groups_inventory to load vars for managed-node1
41684 1727204454.31298: Calling all_plugins_inventory to load vars for managed-node1
41684 1727204454.31305: Calling all_plugins_play to load vars for managed-node1
41684 1727204454.31306: Calling groups_plugins_inventory to load vars for managed-node1
41684 1727204454.31308: Calling groups_plugins_play to load vars for managed-node1
41684 1727204454.31461: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
41684 1727204454.31589: done with get_vars()
41684 1727204454.31597: done getting variables

TASK [Assert device is present] ************************************************
task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_route_device.yml:21
Tuesday 24 September 2024 15:00:54 -0400 (0:00:00.015) 0:00:10.718 *****
41684 1727204454.31659: entering _queue_task() for managed-node1/include_tasks
41684 1727204454.31846: worker is 1 (out of 1 available)
41684 1727204454.31859: exiting _queue_task() for managed-node1/include_tasks
41684 1727204454.31874: done queuing things up, now waiting for results queue to drain
41684 1727204454.31875: waiting for pending results...
41684 1727204454.32033: running TaskExecutor() for managed-node1/TASK: Assert device is present
41684 1727204454.32092: in run() - task 0affcd87-79f5-3839-086d-00000000000e
41684 1727204454.32104: variable 'ansible_search_path' from source: unknown
41684 1727204454.32133: calling self._execute()
41684 1727204454.32199: variable 'ansible_host' from source: host vars for 'managed-node1'
41684 1727204454.32204: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1'
41684 1727204454.32212: variable 'omit' from source: magic vars
41684 1727204454.32473: variable 'ansible_distribution_major_version' from source: facts
41684 1727204454.32486: Evaluated conditional (ansible_distribution_major_version != '6'): True
41684 1727204454.32491: _execute() done
41684 1727204454.32494: dumping result to json
41684 1727204454.32497: done dumping result, returning
41684 1727204454.32504: done running TaskExecutor() for managed-node1/TASK: Assert device is present [0affcd87-79f5-3839-086d-00000000000e]
41684 1727204454.32509: sending task result for task 0affcd87-79f5-3839-086d-00000000000e
41684 1727204454.32596: done sending task result for task 0affcd87-79f5-3839-086d-00000000000e
41684 1727204454.32599: WORKER PROCESS EXITING
41684 1727204454.32629: no more pending results, returning what we have
41684 1727204454.32633: in VariableManager get_vars()
41684 1727204454.32679: Calling all_inventory to load vars for managed-node1
41684 1727204454.32681: Calling groups_inventory to load vars for managed-node1
41684 1727204454.32683: Calling all_plugins_inventory to load vars for managed-node1
41684 1727204454.32694: Calling all_plugins_play to load vars for managed-node1
41684 1727204454.32696: Calling groups_plugins_inventory to load vars for managed-node1
41684 1727204454.32698: Calling groups_plugins_play to load vars for managed-node1
41684 1727204454.32835: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
41684 1727204454.32960: done with get_vars()
41684 1727204454.32968: variable 'ansible_search_path' from source: unknown
41684 1727204454.32977: we have included files to process
41684 1727204454.32978: generating all_blocks data
41684 1727204454.32979: done generating all_blocks data
41684 1727204454.32982: processing included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml
41684 1727204454.32983: loading included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml
41684 1727204454.32984: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml
41684 1727204454.33091: in VariableManager get_vars()
41684 1727204454.33106: done with get_vars()
41684 1727204454.33183: done processing included file
41684 1727204454.33184: iterating over new_blocks loaded from include file
41684 1727204454.33185: in VariableManager get_vars()
41684 1727204454.33197: done with get_vars()
41684 1727204454.33198: filtering new block on tags
41684 1727204454.33209: done filtering new block on tags
41684 1727204454.33210: done iterating over new_blocks loaded from include file
included: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml for managed-node1
41684 1727204454.33214: extending task lists for all hosts with included blocks
41684 1727204454.33602: done extending task lists
41684 1727204454.33603: done processing included files
41684 1727204454.33604: results queue empty
41684 1727204454.33604: checking for any_errors_fatal
41684 1727204454.33607: done checking for any_errors_fatal
41684 1727204454.33607: checking for max_fail_percentage
41684 1727204454.33608: done checking for max_fail_percentage
41684 1727204454.33609: checking to see if all hosts have failed and the running result is not ok
41684 1727204454.33609: done checking to see if all hosts have failed
41684 1727204454.33610: getting the remaining hosts for this loop
41684 1727204454.33610: done getting the remaining hosts for this loop
41684 1727204454.33612: getting the next task for host managed-node1
41684 1727204454.33614: done getting next task for host managed-node1
41684 1727204454.33616: ^ task is: TASK: Include the task 'get_interface_stat.yml'
41684 1727204454.33618: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
41684 1727204454.33619: getting variables
41684 1727204454.33620: in VariableManager get_vars()
41684 1727204454.33628: Calling all_inventory to load vars for managed-node1
41684 1727204454.33630: Calling groups_inventory to load vars for managed-node1
41684 1727204454.33631: Calling all_plugins_inventory to load vars for managed-node1
41684 1727204454.33635: Calling all_plugins_play to load vars for managed-node1
41684 1727204454.33636: Calling groups_plugins_inventory to load vars for managed-node1
41684 1727204454.33638: Calling groups_plugins_play to load vars for managed-node1
41684 1727204454.33728: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
41684 1727204454.33845: done with get_vars()
41684 1727204454.33852: done getting variables

TASK [Include the task 'get_interface_stat.yml'] *******************************
task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml:3
Tuesday 24 September 2024 15:00:54 -0400 (0:00:00.022) 0:00:10.740 *****
41684 1727204454.33903: entering _queue_task() for managed-node1/include_tasks
41684 1727204454.34104: worker is 1 (out of 1 available)
41684 1727204454.34119: exiting _queue_task() for managed-node1/include_tasks
41684 1727204454.34132: done queuing things up, now waiting for results queue to drain
41684 1727204454.34134: waiting for pending results...
41684 1727204454.34296: running TaskExecutor() for managed-node1/TASK: Include the task 'get_interface_stat.yml'
41684 1727204454.34353: in run() - task 0affcd87-79f5-3839-086d-000000000214
41684 1727204454.34371: variable 'ansible_search_path' from source: unknown
41684 1727204454.34375: variable 'ansible_search_path' from source: unknown
41684 1727204454.34405: calling self._execute()
41684 1727204454.34468: variable 'ansible_host' from source: host vars for 'managed-node1'
41684 1727204454.34471: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1'
41684 1727204454.34482: variable 'omit' from source: magic vars
41684 1727204454.34738: variable 'ansible_distribution_major_version' from source: facts
41684 1727204454.34748: Evaluated conditional (ansible_distribution_major_version != '6'): True
41684 1727204454.34754: _execute() done
41684 1727204454.34757: dumping result to json
41684 1727204454.34765: done dumping result, returning
41684 1727204454.34768: done running TaskExecutor() for managed-node1/TASK: Include the task 'get_interface_stat.yml' [0affcd87-79f5-3839-086d-000000000214]
41684 1727204454.34774: sending task result for task 0affcd87-79f5-3839-086d-000000000214
41684 1727204454.34855: done sending task result for task 0affcd87-79f5-3839-086d-000000000214
41684 1727204454.34857: WORKER PROCESS EXITING
41684 1727204454.34888: no more pending results, returning what we have
41684 1727204454.34899: in VariableManager get_vars()
41684 1727204454.34941: Calling all_inventory to load vars for managed-node1
41684 1727204454.34944: Calling groups_inventory to load vars for managed-node1
41684 1727204454.34946: Calling all_plugins_inventory to load vars for managed-node1
41684 1727204454.34956: Calling all_plugins_play to load vars for managed-node1
41684 1727204454.34958: Calling groups_plugins_inventory to load vars for managed-node1
41684 1727204454.34960: Calling groups_plugins_play to load vars for managed-node1
41684 1727204454.35119: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
41684 1727204454.35243: done with get_vars()
41684 1727204454.35249: variable 'ansible_search_path' from source: unknown
41684 1727204454.35249: variable 'ansible_search_path' from source: unknown
41684 1727204454.35277: we have included files to process
41684 1727204454.35278: generating all_blocks data
41684 1727204454.35279: done generating all_blocks data
41684 1727204454.35280: processing included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml
41684 1727204454.35280: loading included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml
41684 1727204454.35282: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml
41684 1727204454.35431: done processing included file
41684 1727204454.35432: iterating over new_blocks loaded from include file
41684 1727204454.35433: in VariableManager get_vars()
41684 1727204454.35447: done with get_vars()
41684 1727204454.35448: filtering new block on tags
41684 1727204454.35458: done filtering new block on tags
41684 1727204454.35459: done iterating over new_blocks loaded from include file
included: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml for managed-node1
41684 1727204454.35466: extending task lists for all hosts with included blocks
41684 1727204454.35521: done extending task lists
41684 1727204454.35522: done processing included files
41684 1727204454.35523: results queue empty
41684 1727204454.35523: checking for any_errors_fatal
41684 1727204454.35525: done checking for any_errors_fatal
41684 1727204454.35526: checking for max_fail_percentage
41684 1727204454.35527: done checking for max_fail_percentage
41684 1727204454.35527: checking to see if all hosts have failed and the running result is not ok
41684 1727204454.35528: done checking to see if all hosts have failed
41684 1727204454.35528: getting the remaining hosts for this loop
41684 1727204454.35529: done getting the remaining hosts for this loop
41684 1727204454.35530: getting the next task for host managed-node1
41684 1727204454.35533: done getting next task for host managed-node1
41684 1727204454.35534: ^ task is: TASK: Get stat for interface {{ interface }}
41684 1727204454.35536: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
41684 1727204454.35538: getting variables
41684 1727204454.35538: in VariableManager get_vars()
41684 1727204454.35547: Calling all_inventory to load vars for managed-node1
41684 1727204454.35549: Calling groups_inventory to load vars for managed-node1
41684 1727204454.35551: Calling all_plugins_inventory to load vars for managed-node1
41684 1727204454.35554: Calling all_plugins_play to load vars for managed-node1
41684 1727204454.35556: Calling groups_plugins_inventory to load vars for managed-node1
41684 1727204454.35559: Calling groups_plugins_play to load vars for managed-node1
41684 1727204454.35646: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
41684 1727204454.35770: done with get_vars()
41684 1727204454.35777: done getting variables
41684 1727204454.35890: variable 'interface' from source: set_fact

TASK [Get stat for interface ethtest0] *****************************************
task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml:3
Tuesday 24 September 2024 15:00:54 -0400 (0:00:00.020) 0:00:10.760 *****
41684 1727204454.35911: entering _queue_task() for managed-node1/stat
41684 1727204454.36111: worker is 1 (out of 1 available)
41684 1727204454.36125: exiting _queue_task() for managed-node1/stat
41684 1727204454.36138: done queuing things up, now waiting for results queue to drain
41684 1727204454.36139: waiting for pending results...
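The body of get_interface_stat.yml itself is not reproduced in this log; only its task path and name appear. In substance it is a `stat` of the interface's device node, which on the target reduces to roughly this shell check (the `/sys/class/net` path is an assumption about what the included file stats, not something shown in the log):

```shell
# Hypothetical shell equivalent of "Get stat for interface ethtest0":
# a network device is "present" when its sysfs node exists.
interface=ethtest0
if [ -e "/sys/class/net/$interface" ]; then
    echo "interface $interface is present"
else
    echo "interface $interface is absent"
fi
```

The subsequent "Assert device is present" logic would then assert on the equivalent of this exists/absent result.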
41684 1727204454.36292: running TaskExecutor() for managed-node1/TASK: Get stat for interface ethtest0
41684 1727204454.36355: in run() - task 0affcd87-79f5-3839-086d-000000000267
41684 1727204454.36367: variable 'ansible_search_path' from source: unknown
41684 1727204454.36372: variable 'ansible_search_path' from source: unknown
41684 1727204454.36400: calling self._execute()
41684 1727204454.36457: variable 'ansible_host' from source: host vars for 'managed-node1'
41684 1727204454.36461: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1'
41684 1727204454.36470: variable 'omit' from source: magic vars
41684 1727204454.36792: variable 'ansible_distribution_major_version' from source: facts
41684 1727204454.36802: Evaluated conditional (ansible_distribution_major_version != '6'): True
41684 1727204454.36808: variable 'omit' from source: magic vars
41684 1727204454.36838: variable 'omit' from source: magic vars
41684 1727204454.36910: variable 'interface' from source: set_fact
41684 1727204454.36922: variable 'omit' from source: magic vars
41684 1727204454.36954: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection
41684 1727204454.36986: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False)
41684 1727204454.37003: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell
41684 1727204454.37016: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
41684 1727204454.37025: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
41684 1727204454.37047: variable 'inventory_hostname' from source: host vars for 'managed-node1'
41684 1727204454.37050: variable 'ansible_host' from source: host vars for 'managed-node1'
41684 1727204454.37054: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1'
41684 1727204454.37124: Set connection var ansible_connection to ssh
41684 1727204454.37134: Set connection var ansible_pipelining to False
41684 1727204454.37139: Set connection var ansible_module_compression to ZIP_DEFLATED
41684 1727204454.37144: Set connection var ansible_timeout to 10
41684 1727204454.37150: Set connection var ansible_shell_executable to /bin/sh
41684 1727204454.37153: Set connection var ansible_shell_type to sh
41684 1727204454.37175: variable 'ansible_shell_executable' from source: unknown
41684 1727204454.37178: variable 'ansible_connection' from source: unknown
41684 1727204454.37181: variable 'ansible_module_compression' from source: unknown
41684 1727204454.37185: variable 'ansible_shell_type' from source: unknown
41684 1727204454.37187: variable 'ansible_shell_executable' from source: unknown
41684 1727204454.37189: variable 'ansible_host' from source: host vars for 'managed-node1'
41684 1727204454.37191: variable 'ansible_pipelining' from source: unknown
41684 1727204454.37195: variable 'ansible_timeout' from source: unknown
41684 1727204454.37197: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1'
41684 1727204454.37342: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__)
41684 1727204454.37352: variable 'omit' from source: magic vars
41684 1727204454.37357: starting attempt loop
41684 1727204454.37360: running the handler
41684 1727204454.37374: _low_level_execute_command(): starting
41684 1727204454.37380: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0'
41684 1727204454.37908: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<<
41684 1727204454.37925: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<<
41684 1727204454.37940: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found <<<
41684 1727204454.37952: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<<
41684 1727204454.37973: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
41684 1727204454.38012: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<<
41684 1727204454.38025: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<<
41684 1727204454.38094: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
41684 1727204454.39647: stdout chunk (state=3): >>>/root <<<
41684 1727204454.39749: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
41684 1727204454.39811: stderr chunk (state=3): >>><<<
41684 1727204454.39814: stdout chunk (state=3): >>><<<
41684 1727204454.39834: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0
41684 1727204454.39845: _low_level_execute_command(): starting
41684 1727204454.39851: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204454.398339-43000-53733091353085 `" && echo ansible-tmp-1727204454.398339-43000-53733091353085="` echo /root/.ansible/tmp/ansible-tmp-1727204454.398339-43000-53733091353085 `" ) && sleep 0'
41684 1727204454.40312: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<<
41684 1727204454.40334: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<<
41684 1727204454.40348: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
41684 1727204454.40376: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
41684 1727204454.40412: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<<
41684 1727204454.40424: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<<
41684 1727204454.40493: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
41684 1727204454.42352: stdout chunk (state=3): >>>ansible-tmp-1727204454.398339-43000-53733091353085=/root/.ansible/tmp/ansible-tmp-1727204454.398339-43000-53733091353085 <<<
41684 1727204454.42469: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
41684 1727204454.42524: stderr chunk (state=3): >>><<<
41684 1727204454.42528: stdout chunk (state=3): >>><<<
41684 1727204454.42543: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204454.398339-43000-53733091353085=/root/.ansible/tmp/ansible-tmp-1727204454.398339-43000-53733091353085 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0
41684 1727204454.42588: variable 'ansible_module_compression' from source: unknown
41684 1727204454.42637: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-41684fyviudxd/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED
41684 1727204454.42669: variable 'ansible_facts' from source: unknown
41684 1727204454.42730: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204454.398339-43000-53733091353085/AnsiballZ_stat.py
41684 1727204454.42844: Sending initial data
41684 1727204454.42847: Sent initial data (151 bytes)
41684 1727204454.43520: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<<
41684 1727204454.43524: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<<
41684 1727204454.43561: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
41684 1727204454.43571: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<<
41684 1727204454.43573: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
41684 1727204454.43621: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<<
41684 1727204454.43624: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<<
41684 1727204454.43687: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
41684 1727204454.45382: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 <<<
41684 1727204454.45386: stderr chunk (state=3): >>>debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<<
41684 1727204454.45433: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<<
41684 1727204454.45488: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-41684fyviudxd/tmpwleo8iw3 /root/.ansible/tmp/ansible-tmp-1727204454.398339-43000-53733091353085/AnsiballZ_stat.py <<<
41684 1727204454.45541: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<<
41684 1727204454.46385: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
41684 1727204454.46493: stderr chunk (state=3): >>><<<
41684 1727204454.46496: stdout chunk (state=3): >>><<<
41684
1727204454.46513: done transferring module to remote 41684 1727204454.46522: _low_level_execute_command(): starting 41684 1727204454.46527: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204454.398339-43000-53733091353085/ /root/.ansible/tmp/ansible-tmp-1727204454.398339-43000-53733091353085/AnsiballZ_stat.py && sleep 0' 41684 1727204454.46978: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 41684 1727204454.46991: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41684 1727204454.47010: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found <<< 41684 1727204454.47029: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 41684 1727204454.47041: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204454.47079: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 41684 1727204454.47091: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 41684 1727204454.47153: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41684 1727204454.48835: stderr chunk (state=3): >>>debug2: Received exit status from master 0 
<<< 41684 1727204454.48892: stderr chunk (state=3): >>><<< 41684 1727204454.48895: stdout chunk (state=3): >>><<< 41684 1727204454.48909: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41684 1727204454.48913: _low_level_execute_command(): starting 41684 1727204454.48918: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204454.398339-43000-53733091353085/AnsiballZ_stat.py && sleep 0' 41684 1727204454.49367: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 41684 1727204454.49381: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41684 1727204454.49400: stderr chunk (state=3): >>>debug2: checking match for 
'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204454.49415: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204454.49457: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 41684 1727204454.49474: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 41684 1727204454.49540: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41684 1727204454.62589: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": true, "path": "/sys/class/net/ethtest0", "mode": "0777", "isdir": false, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": true, "issock": false, "uid": 0, "gid": 0, "size": 0, "inode": 28617, "dev": 21, "nlink": 1, "atime": 1727204452.9529593, "mtime": 1727204452.9529593, "ctime": 1727204452.9529593, "wusr": true, "rusr": true, "xusr": true, "wgrp": true, "rgrp": true, "xgrp": true, "woth": true, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "lnk_source": "/sys/devices/virtual/net/ethtest0", "lnk_target": "../../devices/virtual/net/ethtest0", "pw_name": "root", "gr_name": "root"}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": 
"/sys/class/net/ethtest0", "follow": false, "checksum_algorithm": "sha1"}}} <<< 41684 1727204454.63454: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.148 closed. <<< 41684 1727204454.63519: stderr chunk (state=3): >>><<< 41684 1727204454.63523: stdout chunk (state=3): >>><<< 41684 1727204454.63539: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": true, "path": "/sys/class/net/ethtest0", "mode": "0777", "isdir": false, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": true, "issock": false, "uid": 0, "gid": 0, "size": 0, "inode": 28617, "dev": 21, "nlink": 1, "atime": 1727204452.9529593, "mtime": 1727204452.9529593, "ctime": 1727204452.9529593, "wusr": true, "rusr": true, "xusr": true, "wgrp": true, "rgrp": true, "xgrp": true, "woth": true, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "lnk_source": "/sys/devices/virtual/net/ethtest0", "lnk_target": "../../devices/virtual/net/ethtest0", "pw_name": "root", "gr_name": "root"}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/ethtest0", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration 
data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.148 closed. 41684 1727204454.63582: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/sys/class/net/ethtest0', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204454.398339-43000-53733091353085/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 41684 1727204454.63593: _low_level_execute_command(): starting 41684 1727204454.63597: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204454.398339-43000-53733091353085/ > /dev/null 2>&1 && sleep 0' 41684 1727204454.64072: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 41684 1727204454.64086: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41684 1727204454.64101: stderr chunk (state=3): 
>>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found <<< 41684 1727204454.64112: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 41684 1727204454.64135: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204454.64177: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 41684 1727204454.64188: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 41684 1727204454.64250: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41684 1727204454.65994: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41684 1727204454.66051: stderr chunk (state=3): >>><<< 41684 1727204454.66055: stdout chunk (state=3): >>><<< 41684 1727204454.66076: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: 
Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41684 1727204454.66082: handler run complete 41684 1727204454.66116: attempt loop complete, returning result 41684 1727204454.66119: _execute() done 41684 1727204454.66122: dumping result to json 41684 1727204454.66127: done dumping result, returning 41684 1727204454.66134: done running TaskExecutor() for managed-node1/TASK: Get stat for interface ethtest0 [0affcd87-79f5-3839-086d-000000000267] 41684 1727204454.66139: sending task result for task 0affcd87-79f5-3839-086d-000000000267 41684 1727204454.66247: done sending task result for task 0affcd87-79f5-3839-086d-000000000267 41684 1727204454.66250: WORKER PROCESS EXITING ok: [managed-node1] => { "changed": false, "stat": { "atime": 1727204452.9529593, "block_size": 4096, "blocks": 0, "ctime": 1727204452.9529593, "dev": 21, "device_type": 0, "executable": true, "exists": true, "gid": 0, "gr_name": "root", "inode": 28617, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": true, "isreg": false, "issock": false, "isuid": false, "lnk_source": "/sys/devices/virtual/net/ethtest0", "lnk_target": "../../devices/virtual/net/ethtest0", "mode": "0777", "mtime": 1727204452.9529593, "nlink": 1, "path": "/sys/class/net/ethtest0", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 0, "uid": 0, "wgrp": true, "woth": true, "writeable": true, "wusr": true, "xgrp": true, "xoth": true, "xusr": 
true } } 41684 1727204454.66338: no more pending results, returning what we have 41684 1727204454.66341: results queue empty 41684 1727204454.66342: checking for any_errors_fatal 41684 1727204454.66344: done checking for any_errors_fatal 41684 1727204454.66345: checking for max_fail_percentage 41684 1727204454.66346: done checking for max_fail_percentage 41684 1727204454.66347: checking to see if all hosts have failed and the running result is not ok 41684 1727204454.66348: done checking to see if all hosts have failed 41684 1727204454.66348: getting the remaining hosts for this loop 41684 1727204454.66350: done getting the remaining hosts for this loop 41684 1727204454.66354: getting the next task for host managed-node1 41684 1727204454.66368: done getting next task for host managed-node1 41684 1727204454.66374: ^ task is: TASK: Assert that the interface is present - '{{ interface }}' 41684 1727204454.66376: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 41684 1727204454.66380: getting variables 41684 1727204454.66381: in VariableManager get_vars() 41684 1727204454.66473: Calling all_inventory to load vars for managed-node1 41684 1727204454.66475: Calling groups_inventory to load vars for managed-node1 41684 1727204454.66476: Calling all_plugins_inventory to load vars for managed-node1 41684 1727204454.66484: Calling all_plugins_play to load vars for managed-node1 41684 1727204454.66486: Calling groups_plugins_inventory to load vars for managed-node1 41684 1727204454.66488: Calling groups_plugins_play to load vars for managed-node1 41684 1727204454.66595: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41684 1727204454.66720: done with get_vars() 41684 1727204454.66728: done getting variables 41684 1727204454.66811: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) 41684 1727204454.66932: variable 'interface' from source: set_fact TASK [Assert that the interface is present - 'ethtest0'] *********************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml:5 Tuesday 24 September 2024 15:00:54 -0400 (0:00:00.310) 0:00:11.071 ***** 41684 1727204454.66961: entering _queue_task() for managed-node1/assert 41684 1727204454.66962: Creating lock for assert 41684 1727204454.67219: worker is 1 (out of 1 available) 41684 1727204454.67232: exiting _queue_task() for managed-node1/assert 41684 1727204454.67248: done queuing things up, now waiting for results queue to drain 41684 1727204454.67249: waiting for pending results... 
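For orientation: the module invocation logged above (`_execute_module(stat, ...)` with `path=/sys/class/net/ethtest0` and `get_attributes`/`get_checksum`/`get_mime` all false) corresponds to a stat task roughly like the sketch below. This is a reconstruction from the logged `module_args`, not the verbatim contents of `assert_device_present.yml`; the register name `interface_stat` is inferred from the variable the subsequent assert task reads.

```yaml
# Sketch reconstructed from the logged module_args; the real task lives in
# tests/network/playbooks/tasks/assert_device_present.yml.
- name: Get stat for interface ethtest0
  ansible.builtin.stat:
    path: "/sys/class/net/{{ interface }}"
    follow: false
    get_attributes: false
    get_checksum: false
    get_mime: false
  register: interface_stat
```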
41684 1727204454.67506: running TaskExecutor() for managed-node1/TASK: Assert that the interface is present - 'ethtest0' 41684 1727204454.67607: in run() - task 0affcd87-79f5-3839-086d-000000000215 41684 1727204454.67627: variable 'ansible_search_path' from source: unknown 41684 1727204454.67635: variable 'ansible_search_path' from source: unknown 41684 1727204454.67678: calling self._execute() 41684 1727204454.67778: variable 'ansible_host' from source: host vars for 'managed-node1' 41684 1727204454.67788: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41684 1727204454.67806: variable 'omit' from source: magic vars 41684 1727204454.68153: variable 'ansible_distribution_major_version' from source: facts 41684 1727204454.68175: Evaluated conditional (ansible_distribution_major_version != '6'): True 41684 1727204454.68186: variable 'omit' from source: magic vars 41684 1727204454.68228: variable 'omit' from source: magic vars 41684 1727204454.68331: variable 'interface' from source: set_fact 41684 1727204454.68358: variable 'omit' from source: magic vars 41684 1727204454.68402: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 41684 1727204454.68442: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 41684 1727204454.68474: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 41684 1727204454.68497: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41684 1727204454.68512: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41684 1727204454.68545: variable 'inventory_hostname' from source: host vars for 'managed-node1' 41684 1727204454.68558: variable 'ansible_host' from source: host vars for 'managed-node1' 41684 1727204454.68571: 
variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41684 1727204454.68674: Set connection var ansible_connection to ssh 41684 1727204454.68680: Set connection var ansible_pipelining to False 41684 1727204454.68685: Set connection var ansible_module_compression to ZIP_DEFLATED 41684 1727204454.68692: Set connection var ansible_timeout to 10 41684 1727204454.68698: Set connection var ansible_shell_executable to /bin/sh 41684 1727204454.68704: Set connection var ansible_shell_type to sh 41684 1727204454.68723: variable 'ansible_shell_executable' from source: unknown 41684 1727204454.68726: variable 'ansible_connection' from source: unknown 41684 1727204454.68728: variable 'ansible_module_compression' from source: unknown 41684 1727204454.68731: variable 'ansible_shell_type' from source: unknown 41684 1727204454.68736: variable 'ansible_shell_executable' from source: unknown 41684 1727204454.68741: variable 'ansible_host' from source: host vars for 'managed-node1' 41684 1727204454.68748: variable 'ansible_pipelining' from source: unknown 41684 1727204454.68751: variable 'ansible_timeout' from source: unknown 41684 1727204454.68755: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41684 1727204454.68856: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 41684 1727204454.68868: variable 'omit' from source: magic vars 41684 1727204454.68875: starting attempt loop 41684 1727204454.68878: running the handler 41684 1727204454.68978: variable 'interface_stat' from source: set_fact 41684 1727204454.68994: Evaluated conditional (interface_stat.stat.exists): True 41684 1727204454.68997: handler run complete 41684 1727204454.69009: attempt loop complete, returning result 41684 
1727204454.69012: _execute() done 41684 1727204454.69014: dumping result to json 41684 1727204454.69017: done dumping result, returning 41684 1727204454.69028: done running TaskExecutor() for managed-node1/TASK: Assert that the interface is present - 'ethtest0' [0affcd87-79f5-3839-086d-000000000215] 41684 1727204454.69033: sending task result for task 0affcd87-79f5-3839-086d-000000000215 41684 1727204454.69116: done sending task result for task 0affcd87-79f5-3839-086d-000000000215 41684 1727204454.69118: WORKER PROCESS EXITING ok: [managed-node1] => { "changed": false } MSG: All assertions passed 41684 1727204454.69165: no more pending results, returning what we have 41684 1727204454.69169: results queue empty 41684 1727204454.69170: checking for any_errors_fatal 41684 1727204454.69179: done checking for any_errors_fatal 41684 1727204454.69180: checking for max_fail_percentage 41684 1727204454.69181: done checking for max_fail_percentage 41684 1727204454.69182: checking to see if all hosts have failed and the running result is not ok 41684 1727204454.69183: done checking to see if all hosts have failed 41684 1727204454.69183: getting the remaining hosts for this loop 41684 1727204454.69185: done getting the remaining hosts for this loop 41684 1727204454.69189: getting the next task for host managed-node1 41684 1727204454.69195: done getting next task for host managed-node1 41684 1727204454.69198: ^ task is: TASK: Set interface1 41684 1727204454.69200: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 41684 1727204454.69203: getting variables 41684 1727204454.69205: in VariableManager get_vars() 41684 1727204454.69246: Calling all_inventory to load vars for managed-node1 41684 1727204454.69249: Calling groups_inventory to load vars for managed-node1 41684 1727204454.69251: Calling all_plugins_inventory to load vars for managed-node1 41684 1727204454.69261: Calling all_plugins_play to load vars for managed-node1 41684 1727204454.69265: Calling groups_plugins_inventory to load vars for managed-node1 41684 1727204454.69268: Calling groups_plugins_play to load vars for managed-node1 41684 1727204454.69398: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41684 1727204454.69547: done with get_vars() 41684 1727204454.69555: done getting variables 41684 1727204454.69598: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set interface1] ********************************************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_route_device.yml:23 Tuesday 24 September 2024 15:00:54 -0400 (0:00:00.026) 0:00:11.097 ***** 41684 1727204454.69618: entering _queue_task() for managed-node1/set_fact 41684 1727204454.69820: worker is 1 (out of 1 available) 41684 1727204454.69833: exiting _queue_task() for managed-node1/set_fact 41684 1727204454.69847: done queuing things up, now waiting for results queue to drain 41684 1727204454.69848: waiting for pending results... 
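The assert action above evaluated the conditional `interface_stat.stat.exists` to True and reported "All assertions passed". A task shaped roughly like the following would produce that trace (a sketch inferred from the logged conditional, not the verbatim playbook source):

```yaml
# Sketch inferred from the logged conditional; the real task is at
# tasks/assert_device_present.yml:5.
- name: "Assert that the interface is present - '{{ interface }}'"
  ansible.builtin.assert:
    that:
      - interface_stat.stat.exists
```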
41684 1727204454.70013: running TaskExecutor() for managed-node1/TASK: Set interface1 41684 1727204454.70073: in run() - task 0affcd87-79f5-3839-086d-00000000000f 41684 1727204454.70087: variable 'ansible_search_path' from source: unknown 41684 1727204454.70116: calling self._execute() 41684 1727204454.70181: variable 'ansible_host' from source: host vars for 'managed-node1' 41684 1727204454.70186: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41684 1727204454.70194: variable 'omit' from source: magic vars 41684 1727204454.70497: variable 'ansible_distribution_major_version' from source: facts 41684 1727204454.70514: Evaluated conditional (ansible_distribution_major_version != '6'): True 41684 1727204454.70523: variable 'omit' from source: magic vars 41684 1727204454.70552: variable 'omit' from source: magic vars 41684 1727204454.70585: variable 'interface1' from source: play vars 41684 1727204454.70666: variable 'interface1' from source: play vars 41684 1727204454.70689: variable 'omit' from source: magic vars 41684 1727204454.70730: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 41684 1727204454.70772: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 41684 1727204454.70798: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 41684 1727204454.70819: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41684 1727204454.70834: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41684 1727204454.70871: variable 'inventory_hostname' from source: host vars for 'managed-node1' 41684 1727204454.70880: variable 'ansible_host' from source: host vars for 'managed-node1' 41684 1727204454.70887: variable 'ansible_ssh_extra_args' from source: 
host vars for 'managed-node1' 41684 1727204454.70987: Set connection var ansible_connection to ssh 41684 1727204454.70999: Set connection var ansible_pipelining to False 41684 1727204454.71009: Set connection var ansible_module_compression to ZIP_DEFLATED 41684 1727204454.71018: Set connection var ansible_timeout to 10 41684 1727204454.71029: Set connection var ansible_shell_executable to /bin/sh 41684 1727204454.71036: Set connection var ansible_shell_type to sh 41684 1727204454.71062: variable 'ansible_shell_executable' from source: unknown 41684 1727204454.71073: variable 'ansible_connection' from source: unknown 41684 1727204454.71080: variable 'ansible_module_compression' from source: unknown 41684 1727204454.71086: variable 'ansible_shell_type' from source: unknown 41684 1727204454.71092: variable 'ansible_shell_executable' from source: unknown 41684 1727204454.71098: variable 'ansible_host' from source: host vars for 'managed-node1' 41684 1727204454.71105: variable 'ansible_pipelining' from source: unknown 41684 1727204454.71111: variable 'ansible_timeout' from source: unknown 41684 1727204454.71118: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41684 1727204454.71268: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 41684 1727204454.71285: variable 'omit' from source: magic vars 41684 1727204454.71296: starting attempt loop 41684 1727204454.71302: running the handler 41684 1727204454.71316: handler run complete 41684 1727204454.71329: attempt loop complete, returning result 41684 1727204454.71335: _execute() done 41684 1727204454.71342: dumping result to json 41684 1727204454.71348: done dumping result, returning 41684 1727204454.71357: done running TaskExecutor() for 
managed-node1/TASK: Set interface1 [0affcd87-79f5-3839-086d-00000000000f] 41684 1727204454.71368: sending task result for task 0affcd87-79f5-3839-086d-00000000000f ok: [managed-node1] => { "ansible_facts": { "interface": "ethtest1" }, "changed": false } 41684 1727204454.71511: no more pending results, returning what we have 41684 1727204454.71514: results queue empty 41684 1727204454.71515: checking for any_errors_fatal 41684 1727204454.71522: done checking for any_errors_fatal 41684 1727204454.71523: checking for max_fail_percentage 41684 1727204454.71524: done checking for max_fail_percentage 41684 1727204454.71525: checking to see if all hosts have failed and the running result is not ok 41684 1727204454.71526: done checking to see if all hosts have failed 41684 1727204454.71526: getting the remaining hosts for this loop 41684 1727204454.71528: done getting the remaining hosts for this loop 41684 1727204454.71532: getting the next task for host managed-node1 41684 1727204454.71544: done getting next task for host managed-node1 41684 1727204454.71547: ^ task is: TASK: Show interfaces 41684 1727204454.71549: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 41684 1727204454.71552: getting variables 41684 1727204454.71554: in VariableManager get_vars() 41684 1727204454.71598: Calling all_inventory to load vars for managed-node1 41684 1727204454.71601: Calling groups_inventory to load vars for managed-node1 41684 1727204454.71603: Calling all_plugins_inventory to load vars for managed-node1 41684 1727204454.71608: done sending task result for task 0affcd87-79f5-3839-086d-00000000000f 41684 1727204454.71611: WORKER PROCESS EXITING 41684 1727204454.71621: Calling all_plugins_play to load vars for managed-node1 41684 1727204454.71623: Calling groups_plugins_inventory to load vars for managed-node1 41684 1727204454.71626: Calling groups_plugins_play to load vars for managed-node1 41684 1727204454.71824: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41684 1727204454.72057: done with get_vars() 41684 1727204454.72072: done getting variables TASK [Show interfaces] ********************************************************* task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_route_device.yml:26 Tuesday 24 September 2024 15:00:54 -0400 (0:00:00.028) 0:00:11.125 ***** 41684 1727204454.72427: entering _queue_task() for managed-node1/include_tasks 41684 1727204454.72677: worker is 1 (out of 1 available) 41684 1727204454.72690: exiting _queue_task() for managed-node1/include_tasks 41684 1727204454.72701: done queuing things up, now waiting for results queue to drain 41684 1727204454.72703: waiting for pending results... 
41684 1727204454.72959: running TaskExecutor() for managed-node1/TASK: Show interfaces 41684 1727204454.73048: in run() - task 0affcd87-79f5-3839-086d-000000000010 41684 1727204454.73069: variable 'ansible_search_path' from source: unknown 41684 1727204454.73110: calling self._execute() 41684 1727204454.73198: variable 'ansible_host' from source: host vars for 'managed-node1' 41684 1727204454.73209: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41684 1727204454.73221: variable 'omit' from source: magic vars 41684 1727204454.73571: variable 'ansible_distribution_major_version' from source: facts 41684 1727204454.73592: Evaluated conditional (ansible_distribution_major_version != '6'): True 41684 1727204454.73601: _execute() done 41684 1727204454.73609: dumping result to json 41684 1727204454.73616: done dumping result, returning 41684 1727204454.73625: done running TaskExecutor() for managed-node1/TASK: Show interfaces [0affcd87-79f5-3839-086d-000000000010] 41684 1727204454.73636: sending task result for task 0affcd87-79f5-3839-086d-000000000010 41684 1727204454.73755: no more pending results, returning what we have 41684 1727204454.73760: in VariableManager get_vars() 41684 1727204454.73812: Calling all_inventory to load vars for managed-node1 41684 1727204454.73815: Calling groups_inventory to load vars for managed-node1 41684 1727204454.73818: Calling all_plugins_inventory to load vars for managed-node1 41684 1727204454.73832: Calling all_plugins_play to load vars for managed-node1 41684 1727204454.73834: Calling groups_plugins_inventory to load vars for managed-node1 41684 1727204454.73838: Calling groups_plugins_play to load vars for managed-node1 41684 1727204454.74085: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41684 1727204454.74280: done with get_vars() 41684 1727204454.74289: variable 'ansible_search_path' from source: unknown 41684 1727204454.74306: we have 
included files to process 41684 1727204454.74307: generating all_blocks data 41684 1727204454.74309: done generating all_blocks data 41684 1727204454.74317: processing included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 41684 1727204454.74318: loading included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 41684 1727204454.74322: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 41684 1727204454.74674: in VariableManager get_vars() 41684 1727204454.74700: done with get_vars() 41684 1727204454.74733: done sending task result for task 0affcd87-79f5-3839-086d-000000000010 41684 1727204454.74736: WORKER PROCESS EXITING 41684 1727204454.74826: done processing included file 41684 1727204454.74828: iterating over new_blocks loaded from include file 41684 1727204454.74830: in VariableManager get_vars() 41684 1727204454.74848: done with get_vars() 41684 1727204454.74849: filtering new block on tags 41684 1727204454.74871: done filtering new block on tags 41684 1727204454.74874: done iterating over new_blocks loaded from include file included: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml for managed-node1 41684 1727204454.74879: extending task lists for all hosts with included blocks 41684 1727204454.75619: done extending task lists 41684 1727204454.75621: done processing included files 41684 1727204454.75622: results queue empty 41684 1727204454.75623: checking for any_errors_fatal 41684 1727204454.75625: done checking for any_errors_fatal 41684 1727204454.75626: checking for max_fail_percentage 41684 1727204454.75627: done checking for max_fail_percentage 41684 1727204454.75628: checking to see if all hosts have failed and the running result is not ok 41684 
1727204454.75629: done checking to see if all hosts have failed 41684 1727204454.75629: getting the remaining hosts for this loop 41684 1727204454.75631: done getting the remaining hosts for this loop 41684 1727204454.75633: getting the next task for host managed-node1 41684 1727204454.75637: done getting next task for host managed-node1 41684 1727204454.75639: ^ task is: TASK: Include the task 'get_current_interfaces.yml' 41684 1727204454.75643: ^ state is: HOST STATE: block=2, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 41684 1727204454.75645: getting variables 41684 1727204454.75646: in VariableManager get_vars() 41684 1727204454.75659: Calling all_inventory to load vars for managed-node1 41684 1727204454.75670: Calling groups_inventory to load vars for managed-node1 41684 1727204454.75674: Calling all_plugins_inventory to load vars for managed-node1 41684 1727204454.75679: Calling all_plugins_play to load vars for managed-node1 41684 1727204454.75682: Calling groups_plugins_inventory to load vars for managed-node1 41684 1727204454.75684: Calling groups_plugins_play to load vars for managed-node1 41684 1727204454.75869: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41684 1727204454.76079: done with get_vars() 41684 1727204454.76088: done getting variables TASK [Include the task 'get_current_interfaces.yml'] *************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml:3 Tuesday 24 September 2024 15:00:54 -0400 (0:00:00.037) 0:00:11.163 ***** 41684 1727204454.76160: entering _queue_task() for managed-node1/include_tasks 41684 1727204454.76450: worker is 1 (out of 1 available) 41684 1727204454.76467: exiting _queue_task() for managed-node1/include_tasks 41684 1727204454.76480: done queuing things up, now waiting for results queue to drain 41684 1727204454.76481: waiting for pending results... 
41684 1727204454.76750: running TaskExecutor() for managed-node1/TASK: Include the task 'get_current_interfaces.yml' 41684 1727204454.76858: in run() - task 0affcd87-79f5-3839-086d-000000000282 41684 1727204454.76881: variable 'ansible_search_path' from source: unknown 41684 1727204454.76889: variable 'ansible_search_path' from source: unknown 41684 1727204454.76930: calling self._execute() 41684 1727204454.77018: variable 'ansible_host' from source: host vars for 'managed-node1' 41684 1727204454.77029: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41684 1727204454.77042: variable 'omit' from source: magic vars 41684 1727204454.77396: variable 'ansible_distribution_major_version' from source: facts 41684 1727204454.77414: Evaluated conditional (ansible_distribution_major_version != '6'): True 41684 1727204454.77425: _execute() done 41684 1727204454.77434: dumping result to json 41684 1727204454.77442: done dumping result, returning 41684 1727204454.77452: done running TaskExecutor() for managed-node1/TASK: Include the task 'get_current_interfaces.yml' [0affcd87-79f5-3839-086d-000000000282] 41684 1727204454.77468: sending task result for task 0affcd87-79f5-3839-086d-000000000282 41684 1727204454.77565: done sending task result for task 0affcd87-79f5-3839-086d-000000000282 41684 1727204454.77573: WORKER PROCESS EXITING 41684 1727204454.77606: no more pending results, returning what we have 41684 1727204454.77612: in VariableManager get_vars() 41684 1727204454.77672: Calling all_inventory to load vars for managed-node1 41684 1727204454.77676: Calling groups_inventory to load vars for managed-node1 41684 1727204454.77679: Calling all_plugins_inventory to load vars for managed-node1 41684 1727204454.77694: Calling all_plugins_play to load vars for managed-node1 41684 1727204454.77697: Calling groups_plugins_inventory to load vars for managed-node1 41684 1727204454.77700: Calling groups_plugins_play to load vars for managed-node1 41684 
1727204454.77917: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41684 1727204454.78135: done with get_vars() 41684 1727204454.78144: variable 'ansible_search_path' from source: unknown 41684 1727204454.78145: variable 'ansible_search_path' from source: unknown 41684 1727204454.78190: we have included files to process 41684 1727204454.78191: generating all_blocks data 41684 1727204454.78193: done generating all_blocks data 41684 1727204454.78195: processing included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 41684 1727204454.78196: loading included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 41684 1727204454.78199: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 41684 1727204454.78726: done processing included file 41684 1727204454.78728: iterating over new_blocks loaded from include file 41684 1727204454.78730: in VariableManager get_vars() 41684 1727204454.78748: done with get_vars() 41684 1727204454.78750: filtering new block on tags 41684 1727204454.78772: done filtering new block on tags 41684 1727204454.78774: done iterating over new_blocks loaded from include file included: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml for managed-node1 41684 1727204454.78780: extending task lists for all hosts with included blocks 41684 1727204454.78887: done extending task lists 41684 1727204454.78889: done processing included files 41684 1727204454.78890: results queue empty 41684 1727204454.78890: checking for any_errors_fatal 41684 1727204454.78893: done checking for any_errors_fatal 41684 1727204454.78894: checking for max_fail_percentage 41684 1727204454.78895: done 
checking for max_fail_percentage 41684 1727204454.78896: checking to see if all hosts have failed and the running result is not ok 41684 1727204454.78897: done checking to see if all hosts have failed 41684 1727204454.78898: getting the remaining hosts for this loop 41684 1727204454.78899: done getting the remaining hosts for this loop 41684 1727204454.78901: getting the next task for host managed-node1 41684 1727204454.78906: done getting next task for host managed-node1 41684 1727204454.78908: ^ task is: TASK: Gather current interface info 41684 1727204454.78911: ^ state is: HOST STATE: block=2, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 41684 1727204454.78913: getting variables 41684 1727204454.78914: in VariableManager get_vars() 41684 1727204454.78927: Calling all_inventory to load vars for managed-node1 41684 1727204454.78929: Calling groups_inventory to load vars for managed-node1 41684 1727204454.78931: Calling all_plugins_inventory to load vars for managed-node1 41684 1727204454.78936: Calling all_plugins_play to load vars for managed-node1 41684 1727204454.78938: Calling groups_plugins_inventory to load vars for managed-node1 41684 1727204454.78941: Calling groups_plugins_play to load vars for managed-node1 41684 1727204454.79127: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41684 1727204454.79342: done with get_vars() 41684 1727204454.79351: done getting variables 41684 1727204454.79395: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Gather current interface info] ******************************************* task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:3 Tuesday 24 September 2024 15:00:54 -0400 (0:00:00.032) 0:00:11.195 ***** 41684 1727204454.79424: entering _queue_task() for managed-node1/command 41684 1727204454.79693: worker is 1 (out of 1 available) 41684 1727204454.79705: exiting _queue_task() for managed-node1/command 41684 1727204454.79717: done queuing things up, now waiting for results queue to drain 41684 1727204454.79719: waiting for pending results... 
41684 1727204454.79985: running TaskExecutor() for managed-node1/TASK: Gather current interface info 41684 1727204454.80093: in run() - task 0affcd87-79f5-3839-086d-0000000002e0 41684 1727204454.80112: variable 'ansible_search_path' from source: unknown 41684 1727204454.80120: variable 'ansible_search_path' from source: unknown 41684 1727204454.80159: calling self._execute() 41684 1727204454.80248: variable 'ansible_host' from source: host vars for 'managed-node1' 41684 1727204454.80259: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41684 1727204454.80281: variable 'omit' from source: magic vars 41684 1727204454.80646: variable 'ansible_distribution_major_version' from source: facts 41684 1727204454.80668: Evaluated conditional (ansible_distribution_major_version != '6'): True 41684 1727204454.80680: variable 'omit' from source: magic vars 41684 1727204454.80732: variable 'omit' from source: magic vars 41684 1727204454.80775: variable 'omit' from source: magic vars 41684 1727204454.80823: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 41684 1727204454.80869: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 41684 1727204454.80897: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 41684 1727204454.80919: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41684 1727204454.80938: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41684 1727204454.80975: variable 'inventory_hostname' from source: host vars for 'managed-node1' 41684 1727204454.80984: variable 'ansible_host' from source: host vars for 'managed-node1' 41684 1727204454.80991: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41684 
1727204454.81102: Set connection var ansible_connection to ssh 41684 1727204454.81114: Set connection var ansible_pipelining to False 41684 1727204454.81124: Set connection var ansible_module_compression to ZIP_DEFLATED 41684 1727204454.81134: Set connection var ansible_timeout to 10 41684 1727204454.81149: Set connection var ansible_shell_executable to /bin/sh 41684 1727204454.81156: Set connection var ansible_shell_type to sh 41684 1727204454.81188: variable 'ansible_shell_executable' from source: unknown 41684 1727204454.81196: variable 'ansible_connection' from source: unknown 41684 1727204454.81204: variable 'ansible_module_compression' from source: unknown 41684 1727204454.81210: variable 'ansible_shell_type' from source: unknown 41684 1727204454.81217: variable 'ansible_shell_executable' from source: unknown 41684 1727204454.81223: variable 'ansible_host' from source: host vars for 'managed-node1' 41684 1727204454.81230: variable 'ansible_pipelining' from source: unknown 41684 1727204454.81237: variable 'ansible_timeout' from source: unknown 41684 1727204454.81246: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41684 1727204454.81397: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 41684 1727204454.81413: variable 'omit' from source: magic vars 41684 1727204454.81423: starting attempt loop 41684 1727204454.81429: running the handler 41684 1727204454.81448: _low_level_execute_command(): starting 41684 1727204454.81460: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 41684 1727204454.82248: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 41684 1727204454.82267: stderr chunk (state=3): >>>debug1: Reading configuration data 
/root/.ssh/config <<< 41684 1727204454.82283: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41684 1727204454.82301: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41684 1727204454.82348: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 41684 1727204454.82359: stderr chunk (state=3): >>>debug2: match not found <<< 41684 1727204454.82378: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204454.82398: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 41684 1727204454.82409: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 41684 1727204454.82419: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 41684 1727204454.82432: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41684 1727204454.82450: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41684 1727204454.82470: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41684 1727204454.82483: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 41684 1727204454.82494: stderr chunk (state=3): >>>debug2: match found <<< 41684 1727204454.82507: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204454.82591: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 41684 1727204454.82615: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41684 1727204454.82631: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41684 1727204454.82721: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session 
id: 2 <<< 41684 1727204454.84279: stdout chunk (state=3): >>>/root <<< 41684 1727204454.84484: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41684 1727204454.84487: stdout chunk (state=3): >>><<< 41684 1727204454.84490: stderr chunk (state=3): >>><<< 41684 1727204454.84611: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41684 1727204454.84614: _low_level_execute_command(): starting 41684 1727204454.84618: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204454.8451147-43015-152698735914477 `" && echo ansible-tmp-1727204454.8451147-43015-152698735914477="` echo /root/.ansible/tmp/ansible-tmp-1727204454.8451147-43015-152698735914477 `" ) && sleep 0' 41684 1727204454.85236: stderr chunk (state=2): 
>>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 41684 1727204454.85253: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41684 1727204454.85280: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41684 1727204454.85300: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41684 1727204454.85344: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 41684 1727204454.85357: stderr chunk (state=3): >>>debug2: match not found <<< 41684 1727204454.85383: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204454.85401: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 41684 1727204454.85414: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 41684 1727204454.85425: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 41684 1727204454.85437: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41684 1727204454.85450: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41684 1727204454.85470: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41684 1727204454.85489: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 41684 1727204454.85501: stderr chunk (state=3): >>>debug2: match found <<< 41684 1727204454.85515: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204454.85600: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 41684 1727204454.85625: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41684 1727204454.85643: stderr chunk (state=3): >>>debug2: 
mux_client_hello_exchange: master version 4 <<< 41684 1727204454.85738: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41684 1727204454.87557: stdout chunk (state=3): >>>ansible-tmp-1727204454.8451147-43015-152698735914477=/root/.ansible/tmp/ansible-tmp-1727204454.8451147-43015-152698735914477 <<< 41684 1727204454.87683: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41684 1727204454.87734: stderr chunk (state=3): >>><<< 41684 1727204454.87737: stdout chunk (state=3): >>><<< 41684 1727204454.87754: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204454.8451147-43015-152698735914477=/root/.ansible/tmp/ansible-tmp-1727204454.8451147-43015-152698735914477 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41684 1727204454.87782: variable 'ansible_module_compression' from source: unknown 41684 1727204454.87828: 
ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-41684fyviudxd/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 41684 1727204454.87854: variable 'ansible_facts' from source: unknown 41684 1727204454.87918: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204454.8451147-43015-152698735914477/AnsiballZ_command.py 41684 1727204454.88026: Sending initial data 41684 1727204454.88035: Sent initial data (156 bytes) 41684 1727204454.88715: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 41684 1727204454.88718: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41684 1727204454.88750: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204454.88753: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 41684 1727204454.88756: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204454.88827: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 41684 1727204454.88830: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 41684 1727204454.88930: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41684 
1727204454.90596: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 41684 1727204454.90647: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 41684 1727204454.90698: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-41684fyviudxd/tmpantsd531 /root/.ansible/tmp/ansible-tmp-1727204454.8451147-43015-152698735914477/AnsiballZ_command.py <<< 41684 1727204454.90748: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 41684 1727204454.91597: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41684 1727204454.91856: stderr chunk (state=3): >>><<< 41684 1727204454.91860: stdout chunk (state=3): >>><<< 41684 1727204454.91862: done transferring module to remote 41684 1727204454.91866: _low_level_execute_command(): starting 41684 1727204454.91869: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204454.8451147-43015-152698735914477/ /root/.ansible/tmp/ansible-tmp-1727204454.8451147-43015-152698735914477/AnsiballZ_command.py && sleep 0' 41684 1727204454.92479: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 41684 1727204454.92494: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41684 1727204454.92510: stderr chunk (state=3): >>>debug1: Reading 
configuration data /etc/ssh/ssh_config <<< 41684 1727204454.92536: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41684 1727204454.92589: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 41684 1727204454.92602: stderr chunk (state=3): >>>debug2: match not found <<< 41684 1727204454.92616: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204454.92635: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 41684 1727204454.92653: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 41684 1727204454.92667: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 41684 1727204454.92681: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41684 1727204454.92695: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41684 1727204454.92712: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41684 1727204454.92724: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 41684 1727204454.92735: stderr chunk (state=3): >>>debug2: match found <<< 41684 1727204454.92755: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204454.92832: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 41684 1727204454.92857: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41684 1727204454.92882: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41684 1727204454.92976: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41684 1727204454.94787: stderr chunk (state=3): >>>debug2: Received exit status 
from master 0 <<< 41684 1727204454.94791: stdout chunk (state=3): >>><<< 41684 1727204454.94797: stderr chunk (state=3): >>><<< 41684 1727204454.94903: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41684 1727204454.94909: _low_level_execute_command(): starting 41684 1727204454.94912: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204454.8451147-43015-152698735914477/AnsiballZ_command.py && sleep 0' 41684 1727204454.95561: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 41684 1727204454.95587: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41684 1727204454.95602: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41684 1727204454.95619: stderr chunk (state=3): >>>debug1: Reading configuration 
data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41684 1727204454.95670: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 41684 1727204454.95687: stderr chunk (state=3): >>>debug2: match not found <<< 41684 1727204454.95708: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204454.95725: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 41684 1727204454.95736: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 41684 1727204454.95746: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 41684 1727204454.95757: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41684 1727204454.95776: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41684 1727204454.95796: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41684 1727204454.95815: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 41684 1727204454.95827: stderr chunk (state=3): >>>debug2: match found <<< 41684 1727204454.95839: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204454.95921: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 41684 1727204454.95949: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41684 1727204454.95969: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41684 1727204454.96074: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41684 1727204455.09521: stdout chunk (state=3): >>> {"changed": true, "stdout": "bonding_masters\neth0\nethtest0\nlo\npeerethtest0\nrpltstbr", "stderr": "", "rc": 0, "cmd": ["ls", "-1"], "start": 
"2024-09-24 15:00:55.091346", "end": "2024-09-24 15:00:55.094549", "delta": "0:00:00.003203", "msg": "", "invocation": {"module_args": {"chdir": "/sys/class/net", "_raw_params": "ls -1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 41684 1727204455.10685: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.148 closed. <<< 41684 1727204455.10869: stderr chunk (state=3): >>><<< 41684 1727204455.10873: stdout chunk (state=3): >>><<< 41684 1727204455.10875: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "bonding_masters\neth0\nethtest0\nlo\npeerethtest0\nrpltstbr", "stderr": "", "rc": 0, "cmd": ["ls", "-1"], "start": "2024-09-24 15:00:55.091346", "end": "2024-09-24 15:00:55.094549", "delta": "0:00:00.003203", "msg": "", "invocation": {"module_args": {"chdir": "/sys/class/net", "_raw_params": "ls -1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' 
host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.148 closed. 41684 1727204455.10883: done with _execute_module (ansible.legacy.command, {'chdir': '/sys/class/net', '_raw_params': 'ls -1', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204454.8451147-43015-152698735914477/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 41684 1727204455.10886: _low_level_execute_command(): starting 41684 1727204455.10888: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204454.8451147-43015-152698735914477/ > /dev/null 2>&1 && sleep 0' 41684 1727204455.11496: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 41684 1727204455.11518: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41684 1727204455.11532: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41684 1727204455.11548: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41684 1727204455.11608: stderr chunk (state=3): >>>debug2: checking match for 'final 
all' host 10.31.9.148 originally 10.31.9.148 <<< 41684 1727204455.11632: stderr chunk (state=3): >>>debug2: match not found <<< 41684 1727204455.11659: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204455.11679: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 41684 1727204455.11691: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 41684 1727204455.11700: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 41684 1727204455.11711: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41684 1727204455.11732: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41684 1727204455.11750: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41684 1727204455.11762: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 41684 1727204455.11775: stderr chunk (state=3): >>>debug2: match found <<< 41684 1727204455.11788: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204455.11872: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 41684 1727204455.11894: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41684 1727204455.11908: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41684 1727204455.11993: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41684 1727204455.13867: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41684 1727204455.13871: stdout chunk (state=3): >>><<< 41684 1727204455.13874: stderr chunk (state=3): >>><<< 41684 1727204455.14076: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 
debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41684 1727204455.14080: handler run complete 41684 1727204455.14082: Evaluated conditional (False): False 41684 1727204455.14085: attempt loop complete, returning result 41684 1727204455.14087: _execute() done 41684 1727204455.14088: dumping result to json 41684 1727204455.14090: done dumping result, returning 41684 1727204455.14092: done running TaskExecutor() for managed-node1/TASK: Gather current interface info [0affcd87-79f5-3839-086d-0000000002e0] 41684 1727204455.14095: sending task result for task 0affcd87-79f5-3839-086d-0000000002e0 41684 1727204455.14350: done sending task result for task 0affcd87-79f5-3839-086d-0000000002e0 ok: [managed-node1] => { "changed": false, "cmd": [ "ls", "-1" ], "delta": "0:00:00.003203", "end": "2024-09-24 15:00:55.094549", "rc": 0, "start": "2024-09-24 15:00:55.091346" } STDOUT: bonding_masters eth0 ethtest0 lo peerethtest0 rpltstbr 41684 1727204455.14434: 
no more pending results, returning what we have 41684 1727204455.14437: results queue empty 41684 1727204455.14438: checking for any_errors_fatal 41684 1727204455.14441: done checking for any_errors_fatal 41684 1727204455.14442: checking for max_fail_percentage 41684 1727204455.14444: done checking for max_fail_percentage 41684 1727204455.14445: checking to see if all hosts have failed and the running result is not ok 41684 1727204455.14445: done checking to see if all hosts have failed 41684 1727204455.14446: getting the remaining hosts for this loop 41684 1727204455.14448: done getting the remaining hosts for this loop 41684 1727204455.14451: getting the next task for host managed-node1 41684 1727204455.14457: done getting next task for host managed-node1 41684 1727204455.14459: ^ task is: TASK: Set current_interfaces 41684 1727204455.14468: ^ state is: HOST STATE: block=2, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 41684 1727204455.14471: getting variables 41684 1727204455.14472: in VariableManager get_vars() 41684 1727204455.14516: Calling all_inventory to load vars for managed-node1 41684 1727204455.14519: Calling groups_inventory to load vars for managed-node1 41684 1727204455.14521: Calling all_plugins_inventory to load vars for managed-node1 41684 1727204455.14531: Calling all_plugins_play to load vars for managed-node1 41684 1727204455.14533: Calling groups_plugins_inventory to load vars for managed-node1 41684 1727204455.14536: Calling groups_plugins_play to load vars for managed-node1 41684 1727204455.14728: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41684 1727204455.14952: done with get_vars() 41684 1727204455.14958: WORKER PROCESS EXITING 41684 1727204455.14975: done getting variables 41684 1727204455.15032: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set current_interfaces] ************************************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:9 Tuesday 24 September 2024 15:00:55 -0400 (0:00:00.356) 0:00:11.552 ***** 41684 1727204455.15059: entering _queue_task() for managed-node1/set_fact 41684 1727204455.15342: worker is 1 (out of 1 available) 41684 1727204455.15357: exiting _queue_task() for managed-node1/set_fact 41684 1727204455.15374: done queuing things up, now waiting for results queue to drain 41684 1727204455.15376: waiting for pending results... 
41684 1727204455.15678: running TaskExecutor() for managed-node1/TASK: Set current_interfaces 41684 1727204455.15803: in run() - task 0affcd87-79f5-3839-086d-0000000002e1 41684 1727204455.15830: variable 'ansible_search_path' from source: unknown 41684 1727204455.15839: variable 'ansible_search_path' from source: unknown 41684 1727204455.15891: calling self._execute() 41684 1727204455.15987: variable 'ansible_host' from source: host vars for 'managed-node1' 41684 1727204455.15998: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41684 1727204455.16012: variable 'omit' from source: magic vars 41684 1727204455.16814: variable 'ansible_distribution_major_version' from source: facts 41684 1727204455.16833: Evaluated conditional (ansible_distribution_major_version != '6'): True 41684 1727204455.16850: variable 'omit' from source: magic vars 41684 1727204455.16904: variable 'omit' from source: magic vars 41684 1727204455.17023: variable '_current_interfaces' from source: set_fact 41684 1727204455.17096: variable 'omit' from source: magic vars 41684 1727204455.17144: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 41684 1727204455.17191: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 41684 1727204455.17218: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 41684 1727204455.17246: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41684 1727204455.17260: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41684 1727204455.17306: variable 'inventory_hostname' from source: host vars for 'managed-node1' 41684 1727204455.17315: variable 'ansible_host' from source: host vars for 'managed-node1' 41684 1727204455.17324: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41684 1727204455.17438: Set connection var ansible_connection to ssh 41684 1727204455.17452: Set connection var ansible_pipelining to False 41684 1727204455.17470: Set connection var ansible_module_compression to ZIP_DEFLATED 41684 1727204455.17482: Set connection var ansible_timeout to 10 41684 1727204455.17499: Set connection var ansible_shell_executable to /bin/sh 41684 1727204455.17507: Set connection var ansible_shell_type to sh 41684 1727204455.17535: variable 'ansible_shell_executable' from source: unknown 41684 1727204455.17542: variable 'ansible_connection' from source: unknown 41684 1727204455.17549: variable 'ansible_module_compression' from source: unknown 41684 1727204455.17556: variable 'ansible_shell_type' from source: unknown 41684 1727204455.17570: variable 'ansible_shell_executable' from source: unknown 41684 1727204455.17579: variable 'ansible_host' from source: host vars for 'managed-node1' 41684 1727204455.17587: variable 'ansible_pipelining' from source: unknown 41684 1727204455.17594: variable 'ansible_timeout' from source: unknown 41684 1727204455.17607: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41684 1727204455.17752: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 41684 1727204455.17771: variable 'omit' from source: magic vars 41684 1727204455.17780: starting attempt loop 41684 1727204455.17789: running the handler 41684 1727204455.17801: handler run complete 41684 1727204455.17812: attempt loop complete, returning result 41684 1727204455.17818: _execute() done 41684 1727204455.17827: dumping result to json 41684 1727204455.17833: done dumping result, returning 41684 
1727204455.17841: done running TaskExecutor() for managed-node1/TASK: Set current_interfaces [0affcd87-79f5-3839-086d-0000000002e1] 41684 1727204455.17849: sending task result for task 0affcd87-79f5-3839-086d-0000000002e1 ok: [managed-node1] => { "ansible_facts": { "current_interfaces": [ "bonding_masters", "eth0", "ethtest0", "lo", "peerethtest0", "rpltstbr" ] }, "changed": false } 41684 1727204455.17994: no more pending results, returning what we have 41684 1727204455.17997: results queue empty 41684 1727204455.17998: checking for any_errors_fatal 41684 1727204455.18005: done checking for any_errors_fatal 41684 1727204455.18006: checking for max_fail_percentage 41684 1727204455.18008: done checking for max_fail_percentage 41684 1727204455.18008: checking to see if all hosts have failed and the running result is not ok 41684 1727204455.18009: done checking to see if all hosts have failed 41684 1727204455.18010: getting the remaining hosts for this loop 41684 1727204455.18011: done getting the remaining hosts for this loop 41684 1727204455.18015: getting the next task for host managed-node1 41684 1727204455.18023: done getting next task for host managed-node1 41684 1727204455.18026: ^ task is: TASK: Show current_interfaces 41684 1727204455.18029: ^ state is: HOST STATE: block=2, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 41684 1727204455.18032: getting variables 41684 1727204455.18034: in VariableManager get_vars() 41684 1727204455.18077: Calling all_inventory to load vars for managed-node1 41684 1727204455.18080: Calling groups_inventory to load vars for managed-node1 41684 1727204455.18082: Calling all_plugins_inventory to load vars for managed-node1 41684 1727204455.18093: Calling all_plugins_play to load vars for managed-node1 41684 1727204455.18095: Calling groups_plugins_inventory to load vars for managed-node1 41684 1727204455.18098: Calling groups_plugins_play to load vars for managed-node1 41684 1727204455.18528: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41684 1727204455.19007: done with get_vars() 41684 1727204455.19021: done getting variables 41684 1727204455.19051: done sending task result for task 0affcd87-79f5-3839-086d-0000000002e1 41684 1727204455.19054: WORKER PROCESS EXITING 41684 1727204455.19093: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Show current_interfaces] ************************************************* task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml:5 Tuesday 24 September 2024 15:00:55 -0400 (0:00:00.040) 0:00:11.592 ***** 41684 1727204455.19120: entering _queue_task() for managed-node1/debug 41684 1727204455.19379: worker is 1 (out of 1 available) 41684 1727204455.19391: exiting _queue_task() for managed-node1/debug 41684 1727204455.19404: done queuing things up, now waiting for results queue to drain 41684 1727204455.19405: waiting for pending results... 
41684 1727204455.19675: running TaskExecutor() for managed-node1/TASK: Show current_interfaces 41684 1727204455.19788: in run() - task 0affcd87-79f5-3839-086d-000000000283 41684 1727204455.19811: variable 'ansible_search_path' from source: unknown 41684 1727204455.19820: variable 'ansible_search_path' from source: unknown 41684 1727204455.19867: calling self._execute() 41684 1727204455.19950: variable 'ansible_host' from source: host vars for 'managed-node1' 41684 1727204455.19969: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41684 1727204455.19984: variable 'omit' from source: magic vars 41684 1727204455.20372: variable 'ansible_distribution_major_version' from source: facts 41684 1727204455.20390: Evaluated conditional (ansible_distribution_major_version != '6'): True 41684 1727204455.20406: variable 'omit' from source: magic vars 41684 1727204455.20451: variable 'omit' from source: magic vars 41684 1727204455.20560: variable 'current_interfaces' from source: set_fact 41684 1727204455.20595: variable 'omit' from source: magic vars 41684 1727204455.20644: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 41684 1727204455.20693: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 41684 1727204455.20720: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 41684 1727204455.20745: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41684 1727204455.20760: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41684 1727204455.20800: variable 'inventory_hostname' from source: host vars for 'managed-node1' 41684 1727204455.20810: variable 'ansible_host' from source: host vars for 'managed-node1' 41684 1727204455.20818: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41684 1727204455.20932: Set connection var ansible_connection to ssh 41684 1727204455.20949: Set connection var ansible_pipelining to False 41684 1727204455.20959: Set connection var ansible_module_compression to ZIP_DEFLATED 41684 1727204455.20975: Set connection var ansible_timeout to 10 41684 1727204455.20992: Set connection var ansible_shell_executable to /bin/sh 41684 1727204455.20999: Set connection var ansible_shell_type to sh 41684 1727204455.21030: variable 'ansible_shell_executable' from source: unknown 41684 1727204455.21038: variable 'ansible_connection' from source: unknown 41684 1727204455.21047: variable 'ansible_module_compression' from source: unknown 41684 1727204455.21057: variable 'ansible_shell_type' from source: unknown 41684 1727204455.21068: variable 'ansible_shell_executable' from source: unknown 41684 1727204455.21076: variable 'ansible_host' from source: host vars for 'managed-node1' 41684 1727204455.21084: variable 'ansible_pipelining' from source: unknown 41684 1727204455.21093: variable 'ansible_timeout' from source: unknown 41684 1727204455.21104: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41684 1727204455.21253: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 41684 1727204455.21277: variable 'omit' from source: magic vars 41684 1727204455.21287: starting attempt loop 41684 1727204455.21294: running the handler 41684 1727204455.21344: handler run complete 41684 1727204455.21367: attempt loop complete, returning result 41684 1727204455.21377: _execute() done 41684 1727204455.21386: dumping result to json 41684 1727204455.21394: done dumping result, returning 41684 1727204455.21405: done 
running TaskExecutor() for managed-node1/TASK: Show current_interfaces [0affcd87-79f5-3839-086d-000000000283] 41684 1727204455.21415: sending task result for task 0affcd87-79f5-3839-086d-000000000283 ok: [managed-node1] => {} MSG: current_interfaces: ['bonding_masters', 'eth0', 'ethtest0', 'lo', 'peerethtest0', 'rpltstbr'] 41684 1727204455.21570: no more pending results, returning what we have 41684 1727204455.21575: results queue empty 41684 1727204455.21576: checking for any_errors_fatal 41684 1727204455.21582: done checking for any_errors_fatal 41684 1727204455.21583: checking for max_fail_percentage 41684 1727204455.21584: done checking for max_fail_percentage 41684 1727204455.21585: checking to see if all hosts have failed and the running result is not ok 41684 1727204455.21586: done checking to see if all hosts have failed 41684 1727204455.21587: getting the remaining hosts for this loop 41684 1727204455.21588: done getting the remaining hosts for this loop 41684 1727204455.21592: getting the next task for host managed-node1 41684 1727204455.21601: done getting next task for host managed-node1 41684 1727204455.21605: ^ task is: TASK: Manage test interface 41684 1727204455.21607: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 41684 1727204455.21610: getting variables 41684 1727204455.21612: in VariableManager get_vars() 41684 1727204455.21659: Calling all_inventory to load vars for managed-node1 41684 1727204455.21665: Calling groups_inventory to load vars for managed-node1 41684 1727204455.21668: Calling all_plugins_inventory to load vars for managed-node1 41684 1727204455.21680: Calling all_plugins_play to load vars for managed-node1 41684 1727204455.21683: Calling groups_plugins_inventory to load vars for managed-node1 41684 1727204455.21686: Calling groups_plugins_play to load vars for managed-node1 41684 1727204455.21890: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41684 1727204455.22135: done with get_vars() 41684 1727204455.22146: done getting variables 41684 1727204455.22277: done sending task result for task 0affcd87-79f5-3839-086d-000000000283 41684 1727204455.22280: WORKER PROCESS EXITING TASK [Manage test interface] *************************************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_route_device.yml:28 Tuesday 24 September 2024 15:00:55 -0400 (0:00:00.033) 0:00:11.626 ***** 41684 1727204455.22473: entering _queue_task() for managed-node1/include_tasks 41684 1727204455.22834: worker is 1 (out of 1 available) 41684 1727204455.22848: exiting _queue_task() for managed-node1/include_tasks 41684 1727204455.22866: done queuing things up, now waiting for results queue to drain 41684 1727204455.22868: waiting for pending results... 
41684 1727204455.23029: running TaskExecutor() for managed-node1/TASK: Manage test interface 41684 1727204455.23088: in run() - task 0affcd87-79f5-3839-086d-000000000011 41684 1727204455.23098: variable 'ansible_search_path' from source: unknown 41684 1727204455.23130: calling self._execute() 41684 1727204455.23259: variable 'ansible_host' from source: host vars for 'managed-node1' 41684 1727204455.23266: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41684 1727204455.23275: variable 'omit' from source: magic vars 41684 1727204455.23579: variable 'ansible_distribution_major_version' from source: facts 41684 1727204455.23610: Evaluated conditional (ansible_distribution_major_version != '6'): True 41684 1727204455.23621: _execute() done 41684 1727204455.23628: dumping result to json 41684 1727204455.23635: done dumping result, returning 41684 1727204455.23662: done running TaskExecutor() for managed-node1/TASK: Manage test interface [0affcd87-79f5-3839-086d-000000000011] 41684 1727204455.23676: sending task result for task 0affcd87-79f5-3839-086d-000000000011 41684 1727204455.23801: no more pending results, returning what we have 41684 1727204455.23806: in VariableManager get_vars() 41684 1727204455.23915: Calling all_inventory to load vars for managed-node1 41684 1727204455.23918: Calling groups_inventory to load vars for managed-node1 41684 1727204455.23921: Calling all_plugins_inventory to load vars for managed-node1 41684 1727204455.23927: done sending task result for task 0affcd87-79f5-3839-086d-000000000011 41684 1727204455.23930: WORKER PROCESS EXITING 41684 1727204455.23946: Calling all_plugins_play to load vars for managed-node1 41684 1727204455.23950: Calling groups_plugins_inventory to load vars for managed-node1 41684 1727204455.23953: Calling groups_plugins_play to load vars for managed-node1 41684 1727204455.24255: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 
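One detail worth noting in this log: the same conditional `(ansible_distribution_major_version != '6')` is re-evaluated for every task that comes out of the included file, not just once at the include call. That is standard Ansible behavior for dynamic includes: a `when:` attached to an `include_tasks` task is copied onto each task loaded from the included file and evaluated per task. Schematically:

```yaml
# The `when:` below is not checked once at include time; Ansible applies it
# to every task loaded from the included file and evaluates it per task,
# which is why "Evaluated conditional (...): True" repeats throughout this log.
- include_tasks: tasks/manage_test_interface.yml
  when: ansible_distribution_major_version != '6'
```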
41684 1727204455.24489: done with get_vars() 41684 1727204455.24497: variable 'ansible_search_path' from source: unknown 41684 1727204455.24516: we have included files to process 41684 1727204455.24517: generating all_blocks data 41684 1727204455.24519: done generating all_blocks data 41684 1727204455.24523: processing included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml 41684 1727204455.24524: loading included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml 41684 1727204455.24526: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml 41684 1727204455.25513: in VariableManager get_vars() 41684 1727204455.25537: done with get_vars() 41684 1727204455.26626: done processing included file 41684 1727204455.26628: iterating over new_blocks loaded from include file 41684 1727204455.26629: in VariableManager get_vars() 41684 1727204455.26648: done with get_vars() 41684 1727204455.26650: filtering new block on tags 41684 1727204455.26683: done filtering new block on tags 41684 1727204455.26685: done iterating over new_blocks loaded from include file included: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml for managed-node1 41684 1727204455.26691: extending task lists for all hosts with included blocks 41684 1727204455.28894: done extending task lists 41684 1727204455.28897: done processing included files 41684 1727204455.28898: results queue empty 41684 1727204455.28899: checking for any_errors_fatal 41684 1727204455.28902: done checking for any_errors_fatal 41684 1727204455.28903: checking for max_fail_percentage 41684 1727204455.28904: done checking for max_fail_percentage 41684 1727204455.28905: checking to see if all hosts have failed and the 
running result is not ok 41684 1727204455.28905: done checking to see if all hosts have failed 41684 1727204455.28906: getting the remaining hosts for this loop 41684 1727204455.28908: done getting the remaining hosts for this loop 41684 1727204455.28910: getting the next task for host managed-node1 41684 1727204455.28914: done getting next task for host managed-node1 41684 1727204455.28917: ^ task is: TASK: Ensure state in ["present", "absent"] 41684 1727204455.28919: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 41684 1727204455.28922: getting variables 41684 1727204455.28923: in VariableManager get_vars() 41684 1727204455.28941: Calling all_inventory to load vars for managed-node1 41684 1727204455.28943: Calling groups_inventory to load vars for managed-node1 41684 1727204455.28945: Calling all_plugins_inventory to load vars for managed-node1 41684 1727204455.28950: Calling all_plugins_play to load vars for managed-node1 41684 1727204455.28952: Calling groups_plugins_inventory to load vars for managed-node1 41684 1727204455.28954: Calling groups_plugins_play to load vars for managed-node1 41684 1727204455.29096: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41684 1727204455.29413: done with get_vars() 41684 1727204455.29425: done getting variables 41684 1727204455.29474: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Ensure state in ["present", "absent"]] *********************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:3 Tuesday 24 September 2024 15:00:55 -0400 (0:00:00.070) 0:00:11.696 ***** 41684 1727204455.29504: entering _queue_task() for managed-node1/fail 41684 1727204455.29803: worker is 1 (out of 1 available) 41684 1727204455.29815: exiting _queue_task() for managed-node1/fail 41684 1727204455.29827: done queuing things up, now waiting for results queue to drain 41684 1727204455.29828: waiting for pending results... 
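The skip result for this task records `"false_condition": "state not in [\"present\", \"absent\"]"`, and the log notes that `state` arrives from include params. That pins down the shape of manage_test_interface.yml:3 fairly well; only the failure message is a guess:

```yaml
# Reconstructed from the task name and the false_condition recorded in the
# skip result in this log. The `when:` expression is taken verbatim from
# that result; the msg wording is an assumption.
- name: Ensure state in ["present", "absent"]
  fail:
    msg: "Unsupported state: {{ state }}"   # assumed wording
  when: state not in ["present", "absent"]
```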
41684 1727204455.30226: running TaskExecutor() for managed-node1/TASK: Ensure state in ["present", "absent"] 41684 1727204455.30332: in run() - task 0affcd87-79f5-3839-086d-0000000002fc 41684 1727204455.30355: variable 'ansible_search_path' from source: unknown 41684 1727204455.30365: variable 'ansible_search_path' from source: unknown 41684 1727204455.30413: calling self._execute() 41684 1727204455.30509: variable 'ansible_host' from source: host vars for 'managed-node1' 41684 1727204455.30521: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41684 1727204455.30536: variable 'omit' from source: magic vars 41684 1727204455.30930: variable 'ansible_distribution_major_version' from source: facts 41684 1727204455.30947: Evaluated conditional (ansible_distribution_major_version != '6'): True 41684 1727204455.31092: variable 'state' from source: include params 41684 1727204455.31104: Evaluated conditional (state not in ["present", "absent"]): False 41684 1727204455.31111: when evaluation is False, skipping this task 41684 1727204455.31118: _execute() done 41684 1727204455.31125: dumping result to json 41684 1727204455.31133: done dumping result, returning 41684 1727204455.31145: done running TaskExecutor() for managed-node1/TASK: Ensure state in ["present", "absent"] [0affcd87-79f5-3839-086d-0000000002fc] 41684 1727204455.31154: sending task result for task 0affcd87-79f5-3839-086d-0000000002fc skipping: [managed-node1] => { "changed": false, "false_condition": "state not in [\"present\", \"absent\"]", "skip_reason": "Conditional result was False" } 41684 1727204455.31303: no more pending results, returning what we have 41684 1727204455.31308: results queue empty 41684 1727204455.31309: checking for any_errors_fatal 41684 1727204455.31311: done checking for any_errors_fatal 41684 1727204455.31312: checking for max_fail_percentage 41684 1727204455.31313: done checking for max_fail_percentage 41684 1727204455.31314: checking to see if all hosts 
have failed and the running result is not ok 41684 1727204455.31315: done checking to see if all hosts have failed 41684 1727204455.31316: getting the remaining hosts for this loop 41684 1727204455.31318: done getting the remaining hosts for this loop 41684 1727204455.31322: getting the next task for host managed-node1 41684 1727204455.31329: done getting next task for host managed-node1 41684 1727204455.31332: ^ task is: TASK: Ensure type in ["dummy", "tap", "veth"] 41684 1727204455.31335: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 41684 1727204455.31339: getting variables 41684 1727204455.31341: in VariableManager get_vars() 41684 1727204455.31386: Calling all_inventory to load vars for managed-node1 41684 1727204455.31390: Calling groups_inventory to load vars for managed-node1 41684 1727204455.31392: Calling all_plugins_inventory to load vars for managed-node1 41684 1727204455.31406: Calling all_plugins_play to load vars for managed-node1 41684 1727204455.31409: Calling groups_plugins_inventory to load vars for managed-node1 41684 1727204455.31412: Calling groups_plugins_play to load vars for managed-node1 41684 1727204455.31690: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41684 1727204455.32002: done with get_vars() 41684 1727204455.32030: done getting variables 41684 1727204455.32182: done sending task result for task 0affcd87-79f5-3839-086d-0000000002fc 41684 1727204455.32186: WORKER PROCESS EXITING 41684 1727204455.32229: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Ensure type in ["dummy", "tap", "veth"]] ********************************* task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:8 Tuesday 24 September 2024 15:00:55 -0400 (0:00:00.027) 0:00:11.724 ***** 41684 1727204455.32470: entering _queue_task() for managed-node1/fail 41684 1727204455.32712: worker is 1 (out of 1 available) 41684 1727204455.32724: exiting _queue_task() for managed-node1/fail 41684 1727204455.32736: done queuing things up, now waiting for results queue to drain 41684 1727204455.32738: waiting for pending results... 
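The companion validation task at manage_test_interface.yml:8 is pinned down the same way: its skip result records `"false_condition": "type not in [\"dummy\", \"tap\", \"veth\"]"`, and the log shows `type` coming from an earlier `set_fact`. A likely reconstruction, with only the message wording assumed:

```yaml
# Reconstructed from the task name and recorded false_condition; the log
# notes `type` is supplied by an earlier set_fact. The msg wording is an
# assumption.
- name: Ensure type in ["dummy", "tap", "veth"]
  fail:
    msg: "Unsupported type: {{ type }}"     # assumed wording
  when: type not in ["dummy", "tap", "veth"]
```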
41684 1727204455.33009: running TaskExecutor() for managed-node1/TASK: Ensure type in ["dummy", "tap", "veth"] 41684 1727204455.33111: in run() - task 0affcd87-79f5-3839-086d-0000000002fd 41684 1727204455.33132: variable 'ansible_search_path' from source: unknown 41684 1727204455.33139: variable 'ansible_search_path' from source: unknown 41684 1727204455.33188: calling self._execute() 41684 1727204455.33276: variable 'ansible_host' from source: host vars for 'managed-node1' 41684 1727204455.33289: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41684 1727204455.33304: variable 'omit' from source: magic vars 41684 1727204455.33656: variable 'ansible_distribution_major_version' from source: facts 41684 1727204455.33677: Evaluated conditional (ansible_distribution_major_version != '6'): True 41684 1727204455.35314: variable 'type' from source: set_fact 41684 1727204455.35327: Evaluated conditional (type not in ["dummy", "tap", "veth"]): False 41684 1727204455.35335: when evaluation is False, skipping this task 41684 1727204455.35343: _execute() done 41684 1727204455.35351: dumping result to json 41684 1727204455.35358: done dumping result, returning 41684 1727204455.35373: done running TaskExecutor() for managed-node1/TASK: Ensure type in ["dummy", "tap", "veth"] [0affcd87-79f5-3839-086d-0000000002fd] 41684 1727204455.35389: sending task result for task 0affcd87-79f5-3839-086d-0000000002fd skipping: [managed-node1] => { "changed": false, "false_condition": "type not in [\"dummy\", \"tap\", \"veth\"]", "skip_reason": "Conditional result was False" } 41684 1727204455.35542: no more pending results, returning what we have 41684 1727204455.35547: results queue empty 41684 1727204455.35549: checking for any_errors_fatal 41684 1727204455.35557: done checking for any_errors_fatal 41684 1727204455.35558: checking for max_fail_percentage 41684 1727204455.35560: done checking for max_fail_percentage 41684 1727204455.35560: checking to see if all 
hosts have failed and the running result is not ok 41684 1727204455.35561: done checking to see if all hosts have failed 41684 1727204455.35562: getting the remaining hosts for this loop 41684 1727204455.35566: done getting the remaining hosts for this loop 41684 1727204455.35570: getting the next task for host managed-node1 41684 1727204455.35578: done getting next task for host managed-node1 41684 1727204455.35581: ^ task is: TASK: Include the task 'show_interfaces.yml' 41684 1727204455.35584: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 41684 1727204455.35589: getting variables 41684 1727204455.35591: in VariableManager get_vars() 41684 1727204455.35637: Calling all_inventory to load vars for managed-node1 41684 1727204455.35641: Calling groups_inventory to load vars for managed-node1 41684 1727204455.35644: Calling all_plugins_inventory to load vars for managed-node1 41684 1727204455.35659: Calling all_plugins_play to load vars for managed-node1 41684 1727204455.35662: Calling groups_plugins_inventory to load vars for managed-node1 41684 1727204455.36132: Calling groups_plugins_play to load vars for managed-node1 41684 1727204455.36415: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41684 1727204455.36904: done with get_vars() 41684 1727204455.36916: done getting variables 41684 1727204455.37060: done sending task result for task 0affcd87-79f5-3839-086d-0000000002fd 41684 1727204455.37065: WORKER PROCESS EXITING TASK [Include the task 'show_interfaces.yml'] ********************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:13 Tuesday 24 September 2024 15:00:55 -0400 (0:00:00.048) 0:00:11.773 ***** 41684 1727204455.37165: entering _queue_task() for managed-node1/include_tasks 41684 1727204455.37532: worker is 1 (out of 1 available) 41684 1727204455.37544: exiting _queue_task() for managed-node1/include_tasks 41684 1727204455.37558: done queuing things up, now waiting for results queue to drain 41684 1727204455.37559: waiting for pending results... 
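The task at manage_test_interface.yml:13 is a plain dynamic include. The includes nest: this file pulls in show_interfaces.yml, which in turn includes get_current_interfaces.yml, as the later include-processing entries in this log trace out. A minimal sketch:

```yaml
# Sketch of manage_test_interface.yml:13, inferred from the task name and
# the included file path recorded in this log.
- name: Include the task 'show_interfaces.yml'
  include_tasks: show_interfaces.yml
```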
41684 1727204455.37811: running TaskExecutor() for managed-node1/TASK: Include the task 'show_interfaces.yml' 41684 1727204455.37911: in run() - task 0affcd87-79f5-3839-086d-0000000002fe 41684 1727204455.37934: variable 'ansible_search_path' from source: unknown 41684 1727204455.37942: variable 'ansible_search_path' from source: unknown 41684 1727204455.37984: calling self._execute() 41684 1727204455.38071: variable 'ansible_host' from source: host vars for 'managed-node1' 41684 1727204455.38082: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41684 1727204455.38094: variable 'omit' from source: magic vars 41684 1727204455.38481: variable 'ansible_distribution_major_version' from source: facts 41684 1727204455.38499: Evaluated conditional (ansible_distribution_major_version != '6'): True 41684 1727204455.38509: _execute() done 41684 1727204455.38517: dumping result to json 41684 1727204455.38523: done dumping result, returning 41684 1727204455.38531: done running TaskExecutor() for managed-node1/TASK: Include the task 'show_interfaces.yml' [0affcd87-79f5-3839-086d-0000000002fe] 41684 1727204455.38541: sending task result for task 0affcd87-79f5-3839-086d-0000000002fe 41684 1727204455.38647: done sending task result for task 0affcd87-79f5-3839-086d-0000000002fe 41684 1727204455.38684: no more pending results, returning what we have 41684 1727204455.38690: in VariableManager get_vars() 41684 1727204455.38743: Calling all_inventory to load vars for managed-node1 41684 1727204455.38746: Calling groups_inventory to load vars for managed-node1 41684 1727204455.38749: Calling all_plugins_inventory to load vars for managed-node1 41684 1727204455.38766: Calling all_plugins_play to load vars for managed-node1 41684 1727204455.38770: Calling groups_plugins_inventory to load vars for managed-node1 41684 1727204455.38773: Calling groups_plugins_play to load vars for managed-node1 41684 1727204455.39022: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41684 1727204455.39300: done with get_vars() 41684 1727204455.39309: variable 'ansible_search_path' from source: unknown 41684 1727204455.39310: variable 'ansible_search_path' from source: unknown 41684 1727204455.39357: we have included files to process 41684 1727204455.39359: generating all_blocks data 41684 1727204455.39361: done generating all_blocks data 41684 1727204455.39369: processing included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 41684 1727204455.39370: loading included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 41684 1727204455.39373: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 41684 1727204455.39721: in VariableManager get_vars() 41684 1727204455.39747: done with get_vars() 41684 1727204455.39880: WORKER PROCESS EXITING 41684 1727204455.39970: done processing included file 41684 1727204455.39972: iterating over new_blocks loaded from include file 41684 1727204455.39974: in VariableManager get_vars() 41684 1727204455.39994: done with get_vars() 41684 1727204455.39996: filtering new block on tags 41684 1727204455.40015: done filtering new block on tags 41684 1727204455.40017: done iterating over new_blocks loaded from include file included: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml for managed-node1 41684 1727204455.40022: extending task lists for all hosts with included blocks 41684 1727204455.40471: done extending task lists 41684 1727204455.40472: done processing included files 41684 1727204455.40473: results queue empty 41684 1727204455.40474: checking for any_errors_fatal 41684 1727204455.40477: done checking for 
any_errors_fatal 41684 1727204455.40478: checking for max_fail_percentage 41684 1727204455.40479: done checking for max_fail_percentage 41684 1727204455.40480: checking to see if all hosts have failed and the running result is not ok 41684 1727204455.40481: done checking to see if all hosts have failed 41684 1727204455.40481: getting the remaining hosts for this loop 41684 1727204455.40483: done getting the remaining hosts for this loop 41684 1727204455.40485: getting the next task for host managed-node1 41684 1727204455.40489: done getting next task for host managed-node1 41684 1727204455.40491: ^ task is: TASK: Include the task 'get_current_interfaces.yml' 41684 1727204455.40494: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 41684 1727204455.40497: getting variables 41684 1727204455.40498: in VariableManager get_vars() 41684 1727204455.40511: Calling all_inventory to load vars for managed-node1 41684 1727204455.40513: Calling groups_inventory to load vars for managed-node1 41684 1727204455.40515: Calling all_plugins_inventory to load vars for managed-node1 41684 1727204455.40520: Calling all_plugins_play to load vars for managed-node1 41684 1727204455.40522: Calling groups_plugins_inventory to load vars for managed-node1 41684 1727204455.40525: Calling groups_plugins_play to load vars for managed-node1 41684 1727204455.40679: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41684 1727204455.40892: done with get_vars() 41684 1727204455.40901: done getting variables TASK [Include the task 'get_current_interfaces.yml'] *************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml:3 Tuesday 24 September 2024 15:00:55 -0400 (0:00:00.038) 0:00:11.811 ***** 41684 1727204455.40967: entering _queue_task() for managed-node1/include_tasks 41684 1727204455.41232: worker is 1 (out of 1 available) 41684 1727204455.41246: exiting _queue_task() for managed-node1/include_tasks 41684 1727204455.41260: done queuing things up, now waiting for results queue to drain 41684 1727204455.41267: waiting for pending results... 
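show_interfaces.yml:3 is the second level of the include chain. Each level of include nesting is visible in the HOST STATE dumps in this log as one more `tasks child state? (HOST STATE: ...)` wrapper around the inner state. A minimal sketch of the include itself:

```yaml
# Sketch of show_interfaces.yml:3, inferred from the task name and included
# file path in this log. Each nested include adds one more
# "tasks child state? (HOST STATE: ...)" level to the state dumps above.
- name: Include the task 'get_current_interfaces.yml'
  include_tasks: get_current_interfaces.yml
```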
41684 1727204455.41601: running TaskExecutor() for managed-node1/TASK: Include the task 'get_current_interfaces.yml' 41684 1727204455.41714: in run() - task 0affcd87-79f5-3839-086d-000000000374 41684 1727204455.41734: variable 'ansible_search_path' from source: unknown 41684 1727204455.41746: variable 'ansible_search_path' from source: unknown 41684 1727204455.41793: calling self._execute() 41684 1727204455.41915: variable 'ansible_host' from source: host vars for 'managed-node1' 41684 1727204455.41926: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41684 1727204455.41940: variable 'omit' from source: magic vars 41684 1727204455.42323: variable 'ansible_distribution_major_version' from source: facts 41684 1727204455.42342: Evaluated conditional (ansible_distribution_major_version != '6'): True 41684 1727204455.42354: _execute() done 41684 1727204455.42368: dumping result to json 41684 1727204455.42378: done dumping result, returning 41684 1727204455.42388: done running TaskExecutor() for managed-node1/TASK: Include the task 'get_current_interfaces.yml' [0affcd87-79f5-3839-086d-000000000374] 41684 1727204455.42402: sending task result for task 0affcd87-79f5-3839-086d-000000000374 41684 1727204455.42527: no more pending results, returning what we have 41684 1727204455.42532: in VariableManager get_vars() 41684 1727204455.42585: Calling all_inventory to load vars for managed-node1 41684 1727204455.42588: Calling groups_inventory to load vars for managed-node1 41684 1727204455.42591: Calling all_plugins_inventory to load vars for managed-node1 41684 1727204455.42605: Calling all_plugins_play to load vars for managed-node1 41684 1727204455.42609: Calling groups_plugins_inventory to load vars for managed-node1 41684 1727204455.42612: Calling groups_plugins_play to load vars for managed-node1 41684 1727204455.42823: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41684 
1727204455.43106: done with get_vars() 41684 1727204455.43121: variable 'ansible_search_path' from source: unknown 41684 1727204455.43123: variable 'ansible_search_path' from source: unknown 41684 1727204455.43290: done sending task result for task 0affcd87-79f5-3839-086d-000000000374 41684 1727204455.43293: WORKER PROCESS EXITING 41684 1727204455.43527: we have included files to process 41684 1727204455.43528: generating all_blocks data 41684 1727204455.43530: done generating all_blocks data 41684 1727204455.43531: processing included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 41684 1727204455.43532: loading included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 41684 1727204455.43535: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 41684 1727204455.43880: done processing included file 41684 1727204455.43882: iterating over new_blocks loaded from include file 41684 1727204455.43884: in VariableManager get_vars() 41684 1727204455.43903: done with get_vars() 41684 1727204455.43905: filtering new block on tags 41684 1727204455.43923: done filtering new block on tags 41684 1727204455.43925: done iterating over new_blocks loaded from include file included: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml for managed-node1 41684 1727204455.43930: extending task lists for all hosts with included blocks 41684 1727204455.44083: done extending task lists 41684 1727204455.44085: done processing included files 41684 1727204455.44085: results queue empty 41684 1727204455.44086: checking for any_errors_fatal 41684 1727204455.44089: done checking for any_errors_fatal 41684 1727204455.44090: checking for max_fail_percentage 41684 
1727204455.44091: done checking for max_fail_percentage 41684 1727204455.44092: checking to see if all hosts have failed and the running result is not ok 41684 1727204455.44093: done checking to see if all hosts have failed 41684 1727204455.44094: getting the remaining hosts for this loop 41684 1727204455.44095: done getting the remaining hosts for this loop 41684 1727204455.44097: getting the next task for host managed-node1 41684 1727204455.44101: done getting next task for host managed-node1 41684 1727204455.44104: ^ task is: TASK: Gather current interface info 41684 1727204455.44107: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 41684 1727204455.44109: getting variables 41684 1727204455.44110: in VariableManager get_vars() 41684 1727204455.44122: Calling all_inventory to load vars for managed-node1 41684 1727204455.44124: Calling groups_inventory to load vars for managed-node1 41684 1727204455.44126: Calling all_plugins_inventory to load vars for managed-node1 41684 1727204455.44131: Calling all_plugins_play to load vars for managed-node1 41684 1727204455.44133: Calling groups_plugins_inventory to load vars for managed-node1 41684 1727204455.44136: Calling groups_plugins_play to load vars for managed-node1 41684 1727204455.44294: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41684 1727204455.44515: done with get_vars() 41684 1727204455.44523: done getting variables 41684 1727204455.44565: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Gather current interface info] ******************************************* task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:3 Tuesday 24 September 2024 15:00:55 -0400 (0:00:00.036) 0:00:11.847 ***** 41684 1727204455.44594: entering _queue_task() for managed-node1/command 41684 1727204455.44844: worker is 1 (out of 1 available) 41684 1727204455.44858: exiting _queue_task() for managed-node1/command 41684 1727204455.44875: done queuing things up, now waiting for results queue to drain 41684 1727204455.44876: waiting for pending results... 
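For get_current_interfaces.yml:3 the log confirms only that the task uses the `command` action module. The exact command is not shown, so the sketch below is an assumption — though listing /sys/class/net would be consistent with the `current_interfaces` value printed earlier in this log, which includes the `bonding_masters` entry that appears in that directory:

```yaml
# Hedged reconstruction of get_current_interfaces.yml:3. Only the use of the
# 'command' action is confirmed by this log; the command, register name, and
# follow-up set_fact are assumptions consistent with the interface list
# (including 'bonding_masters') shown earlier.
- name: Gather current interface info
  command: ls -1 /sys/class/net
  register: _current_interfaces_raw       # hypothetical register name
- name: Set current_interfaces
  set_fact:
    current_interfaces: "{{ _current_interfaces_raw.stdout_lines }}"
```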
41684 1727204455.45136: running TaskExecutor() for managed-node1/TASK: Gather current interface info 41684 1727204455.45253: in run() - task 0affcd87-79f5-3839-086d-0000000003ab 41684 1727204455.45280: variable 'ansible_search_path' from source: unknown 41684 1727204455.45288: variable 'ansible_search_path' from source: unknown 41684 1727204455.45330: calling self._execute() 41684 1727204455.45412: variable 'ansible_host' from source: host vars for 'managed-node1' 41684 1727204455.45427: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41684 1727204455.45441: variable 'omit' from source: magic vars 41684 1727204455.45877: variable 'ansible_distribution_major_version' from source: facts 41684 1727204455.45895: Evaluated conditional (ansible_distribution_major_version != '6'): True 41684 1727204455.45906: variable 'omit' from source: magic vars 41684 1727204455.45965: variable 'omit' from source: magic vars 41684 1727204455.46006: variable 'omit' from source: magic vars 41684 1727204455.46049: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 41684 1727204455.46099: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 41684 1727204455.46124: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 41684 1727204455.46144: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41684 1727204455.46158: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41684 1727204455.46196: variable 'inventory_hostname' from source: host vars for 'managed-node1' 41684 1727204455.46205: variable 'ansible_host' from source: host vars for 'managed-node1' 41684 1727204455.46212: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41684 
1727204455.46318: Set connection var ansible_connection to ssh 41684 1727204455.46329: Set connection var ansible_pipelining to False 41684 1727204455.46338: Set connection var ansible_module_compression to ZIP_DEFLATED 41684 1727204455.46346: Set connection var ansible_timeout to 10 41684 1727204455.46357: Set connection var ansible_shell_executable to /bin/sh 41684 1727204455.46367: Set connection var ansible_shell_type to sh 41684 1727204455.46394: variable 'ansible_shell_executable' from source: unknown 41684 1727204455.46405: variable 'ansible_connection' from source: unknown 41684 1727204455.46413: variable 'ansible_module_compression' from source: unknown 41684 1727204455.46419: variable 'ansible_shell_type' from source: unknown 41684 1727204455.46425: variable 'ansible_shell_executable' from source: unknown 41684 1727204455.46431: variable 'ansible_host' from source: host vars for 'managed-node1' 41684 1727204455.46437: variable 'ansible_pipelining' from source: unknown 41684 1727204455.46444: variable 'ansible_timeout' from source: unknown 41684 1727204455.46451: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41684 1727204455.46597: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 41684 1727204455.46613: variable 'omit' from source: magic vars 41684 1727204455.46627: starting attempt loop 41684 1727204455.46634: running the handler 41684 1727204455.46654: _low_level_execute_command(): starting 41684 1727204455.46673: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 41684 1727204455.47455: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 41684 1727204455.47477: stderr chunk (state=3): >>>debug1: Reading configuration data 
/root/.ssh/config <<< 41684 1727204455.47493: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41684 1727204455.47514: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41684 1727204455.47567: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 41684 1727204455.47582: stderr chunk (state=3): >>>debug2: match not found <<< 41684 1727204455.47597: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204455.47618: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 41684 1727204455.47634: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 41684 1727204455.47646: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 41684 1727204455.47658: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41684 1727204455.47677: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41684 1727204455.47693: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41684 1727204455.47706: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 41684 1727204455.47718: stderr chunk (state=3): >>>debug2: match found <<< 41684 1727204455.47733: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204455.47815: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 41684 1727204455.47841: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41684 1727204455.47868: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41684 1727204455.47966: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session 
id: 2 <<< 41684 1727204455.49618: stdout chunk (state=3): >>>/root <<< 41684 1727204455.49784: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41684 1727204455.49838: stderr chunk (state=3): >>><<< 41684 1727204455.49842: stdout chunk (state=3): >>><<< 41684 1727204455.49870: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41684 1727204455.49972: _low_level_execute_command(): starting 41684 1727204455.49976: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204455.4986491-43057-133973602567583 `" && echo ansible-tmp-1727204455.4986491-43057-133973602567583="` echo /root/.ansible/tmp/ansible-tmp-1727204455.4986491-43057-133973602567583 `" ) && sleep 0' 41684 1727204455.50845: stderr chunk (state=2): 
>>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 41684 1727204455.50849: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41684 1727204455.50891: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found <<< 41684 1727204455.50894: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 41684 1727204455.50896: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 41684 1727204455.50899: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204455.50972: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 41684 1727204455.50986: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41684 1727204455.51082: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41684 1727204455.52926: stdout chunk (state=3): >>>ansible-tmp-1727204455.4986491-43057-133973602567583=/root/.ansible/tmp/ansible-tmp-1727204455.4986491-43057-133973602567583 <<< 41684 1727204455.53130: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41684 1727204455.53134: stdout chunk (state=3): >>><<< 41684 1727204455.53137: stderr chunk (state=3): >>><<< 41684 1727204455.53170: _low_level_execute_command() done: rc=0, 
stdout=ansible-tmp-1727204455.4986491-43057-133973602567583=/root/.ansible/tmp/ansible-tmp-1727204455.4986491-43057-133973602567583 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41684 1727204455.53473: variable 'ansible_module_compression' from source: unknown 41684 1727204455.53476: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-41684fyviudxd/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 41684 1727204455.53479: variable 'ansible_facts' from source: unknown 41684 1727204455.53481: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204455.4986491-43057-133973602567583/AnsiballZ_command.py 41684 1727204455.53536: Sending initial data 41684 1727204455.53539: Sent initial data (156 bytes) 41684 1727204455.54540: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 41684 1727204455.54555: stderr chunk (state=3): >>>debug1: Reading 
configuration data /root/.ssh/config <<< 41684 1727204455.54576: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41684 1727204455.54596: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41684 1727204455.54639: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 41684 1727204455.54652: stderr chunk (state=3): >>>debug2: match not found <<< 41684 1727204455.54674: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204455.54696: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 41684 1727204455.54709: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 41684 1727204455.54721: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 41684 1727204455.54733: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41684 1727204455.54748: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41684 1727204455.54769: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41684 1727204455.54783: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 41684 1727204455.54796: stderr chunk (state=3): >>>debug2: match found <<< 41684 1727204455.54810: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204455.54895: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 41684 1727204455.54912: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41684 1727204455.54927: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41684 1727204455.55238: stderr chunk (state=3): >>>debug1: 
mux_client_request_session: master session id: 2 <<< 41684 1727204455.56933: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 41684 1727204455.57002: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 41684 1727204455.57069: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-41684fyviudxd/tmppj6wdtvt /root/.ansible/tmp/ansible-tmp-1727204455.4986491-43057-133973602567583/AnsiballZ_command.py <<< 41684 1727204455.57106: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 41684 1727204455.58274: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41684 1727204455.58471: stderr chunk (state=3): >>><<< 41684 1727204455.58474: stdout chunk (state=3): >>><<< 41684 1727204455.58477: done transferring module to remote 41684 1727204455.58479: _low_level_execute_command(): starting 41684 1727204455.58481: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204455.4986491-43057-133973602567583/ /root/.ansible/tmp/ansible-tmp-1727204455.4986491-43057-133973602567583/AnsiballZ_command.py && sleep 0' 41684 1727204455.59053: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 41684 1727204455.59057: stderr chunk (state=3): >>>debug1: 
Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41684 1727204455.59092: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204455.59097: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 41684 1727204455.59099: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204455.59146: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 41684 1727204455.59150: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41684 1727204455.59219: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41684 1727204455.60991: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41684 1727204455.60994: stdout chunk (state=3): >>><<< 41684 1727204455.60996: stderr chunk (state=3): >>><<< 41684 1727204455.61086: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41684 1727204455.61090: _low_level_execute_command(): starting 41684 1727204455.61092: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204455.4986491-43057-133973602567583/AnsiballZ_command.py && sleep 0' 41684 1727204455.61670: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 41684 1727204455.61681: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41684 1727204455.61710: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204455.61713: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 41684 1727204455.61715: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204455.61760: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 41684 1727204455.61780: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41684 1727204455.61791: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41684 1727204455.61857: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41684 1727204455.75106: stdout chunk (state=3): >>> {"changed": true, "stdout": "bonding_masters\neth0\nethtest0\nlo\npeerethtest0\nrpltstbr", "stderr": "", "rc": 0, "cmd": ["ls", "-1"], "start": "2024-09-24 15:00:55.747416", "end": "2024-09-24 15:00:55.750375", "delta": "0:00:00.002959", "msg": "", "invocation": {"module_args": {"chdir": "/sys/class/net", "_raw_params": "ls -1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 41684 1727204455.76188: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.148 closed. 
<<< 41684 1727204455.76243: stderr chunk (state=3): >>><<< 41684 1727204455.76247: stdout chunk (state=3): >>><<< 41684 1727204455.76261: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "bonding_masters\neth0\nethtest0\nlo\npeerethtest0\nrpltstbr", "stderr": "", "rc": 0, "cmd": ["ls", "-1"], "start": "2024-09-24 15:00:55.747416", "end": "2024-09-24 15:00:55.750375", "delta": "0:00:00.002959", "msg": "", "invocation": {"module_args": {"chdir": "/sys/class/net", "_raw_params": "ls -1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.148 closed. 
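The module result above arrives as a single JSON object on stdout. The follow-up "Set current_interfaces" task later in this log does nothing more than lift the command's stdout lines into a fact. A hedged sketch of that parsing step (the function and variable names are illustrative, and the empty-list fallback on a nonzero `rc` is an assumption, not Ansible's actual error handling):

```python
import json

def current_interfaces_from_result(raw_json):
    """Derive the `current_interfaces` fact from a command-module result.

    Sketch: Ansible splits the result's "stdout" field into stdout_lines;
    the set_fact task stores those lines as the current_interfaces list.
    """
    result = json.loads(raw_json)
    if result.get("rc", 1) != 0:
        # Assumption for this sketch: treat a failed command as no interfaces.
        return []
    return result["stdout"].splitlines()
```

Feeding it the exact payload shown above yields the six names that appear in the set_fact output that follows.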
41684 1727204455.76298: done with _execute_module (ansible.legacy.command, {'chdir': '/sys/class/net', '_raw_params': 'ls -1', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204455.4986491-43057-133973602567583/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 41684 1727204455.76307: _low_level_execute_command(): starting 41684 1727204455.76310: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204455.4986491-43057-133973602567583/ > /dev/null 2>&1 && sleep 0' 41684 1727204455.76786: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 41684 1727204455.76790: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41684 1727204455.76824: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204455.76827: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 41684 1727204455.76829: stderr chunk (state=3): >>>debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204455.76877: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 41684 1727204455.76888: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 41684 1727204455.76953: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41684 1727204455.78684: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41684 1727204455.78738: stderr chunk (state=3): >>><<< 41684 1727204455.78741: stdout chunk (state=3): >>><<< 41684 1727204455.78755: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit 
status from master 0 41684 1727204455.78768: handler run complete 41684 1727204455.78788: Evaluated conditional (False): False 41684 1727204455.78797: attempt loop complete, returning result 41684 1727204455.78800: _execute() done 41684 1727204455.78802: dumping result to json 41684 1727204455.78807: done dumping result, returning 41684 1727204455.78814: done running TaskExecutor() for managed-node1/TASK: Gather current interface info [0affcd87-79f5-3839-086d-0000000003ab] 41684 1727204455.78819: sending task result for task 0affcd87-79f5-3839-086d-0000000003ab 41684 1727204455.78922: done sending task result for task 0affcd87-79f5-3839-086d-0000000003ab 41684 1727204455.78924: WORKER PROCESS EXITING ok: [managed-node1] => { "changed": false, "cmd": [ "ls", "-1" ], "delta": "0:00:00.002959", "end": "2024-09-24 15:00:55.750375", "rc": 0, "start": "2024-09-24 15:00:55.747416" } STDOUT: bonding_masters eth0 ethtest0 lo peerethtest0 rpltstbr 41684 1727204455.78998: no more pending results, returning what we have 41684 1727204455.79002: results queue empty 41684 1727204455.79003: checking for any_errors_fatal 41684 1727204455.79005: done checking for any_errors_fatal 41684 1727204455.79005: checking for max_fail_percentage 41684 1727204455.79007: done checking for max_fail_percentage 41684 1727204455.79008: checking to see if all hosts have failed and the running result is not ok 41684 1727204455.79009: done checking to see if all hosts have failed 41684 1727204455.79009: getting the remaining hosts for this loop 41684 1727204455.79011: done getting the remaining hosts for this loop 41684 1727204455.79015: getting the next task for host managed-node1 41684 1727204455.79021: done getting next task for host managed-node1 41684 1727204455.79024: ^ task is: TASK: Set current_interfaces 41684 1727204455.79029: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, 
pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 41684 1727204455.79032: getting variables 41684 1727204455.79033: in VariableManager get_vars() 41684 1727204455.79130: Calling all_inventory to load vars for managed-node1 41684 1727204455.79133: Calling groups_inventory to load vars for managed-node1 41684 1727204455.79134: Calling all_plugins_inventory to load vars for managed-node1 41684 1727204455.79142: Calling all_plugins_play to load vars for managed-node1 41684 1727204455.79144: Calling groups_plugins_inventory to load vars for managed-node1 41684 1727204455.79146: Calling groups_plugins_play to load vars for managed-node1 41684 1727204455.79256: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41684 1727204455.79389: done with get_vars() 41684 1727204455.79397: done getting variables 41684 1727204455.79439: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set current_interfaces] ************************************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:9 Tuesday 24 September 2024 15:00:55 -0400 (0:00:00.348) 0:00:12.196 ***** 41684 1727204455.79466: entering _queue_task() for managed-node1/set_fact 41684 1727204455.79660: worker is 1 (out of 1 available) 41684 1727204455.79683: exiting _queue_task() for managed-node1/set_fact 41684 1727204455.79698: done queuing things up, now waiting for results queue to drain 41684 1727204455.79700: waiting for pending results... 41684 1727204455.79853: running TaskExecutor() for managed-node1/TASK: Set current_interfaces 41684 1727204455.79925: in run() - task 0affcd87-79f5-3839-086d-0000000003ac 41684 1727204455.79937: variable 'ansible_search_path' from source: unknown 41684 1727204455.79941: variable 'ansible_search_path' from source: unknown 41684 1727204455.79971: calling self._execute() 41684 1727204455.80038: variable 'ansible_host' from source: host vars for 'managed-node1' 41684 1727204455.80042: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41684 1727204455.80050: variable 'omit' from source: magic vars 41684 1727204455.80311: variable 'ansible_distribution_major_version' from source: facts 41684 1727204455.80327: Evaluated conditional (ansible_distribution_major_version != '6'): True 41684 1727204455.80330: variable 'omit' from source: magic vars 41684 1727204455.80365: variable 'omit' from source: magic vars 41684 1727204455.80436: variable '_current_interfaces' from source: set_fact 41684 1727204455.80491: variable 'omit' from source: magic vars 41684 1727204455.80522: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 41684 
1727204455.80553: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 41684 1727204455.80574: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 41684 1727204455.80587: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41684 1727204455.80597: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41684 1727204455.80619: variable 'inventory_hostname' from source: host vars for 'managed-node1' 41684 1727204455.80622: variable 'ansible_host' from source: host vars for 'managed-node1' 41684 1727204455.80625: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41684 1727204455.80702: Set connection var ansible_connection to ssh 41684 1727204455.80706: Set connection var ansible_pipelining to False 41684 1727204455.80711: Set connection var ansible_module_compression to ZIP_DEFLATED 41684 1727204455.80718: Set connection var ansible_timeout to 10 41684 1727204455.80724: Set connection var ansible_shell_executable to /bin/sh 41684 1727204455.80727: Set connection var ansible_shell_type to sh 41684 1727204455.80745: variable 'ansible_shell_executable' from source: unknown 41684 1727204455.80748: variable 'ansible_connection' from source: unknown 41684 1727204455.80750: variable 'ansible_module_compression' from source: unknown 41684 1727204455.80753: variable 'ansible_shell_type' from source: unknown 41684 1727204455.80755: variable 'ansible_shell_executable' from source: unknown 41684 1727204455.80762: variable 'ansible_host' from source: host vars for 'managed-node1' 41684 1727204455.80770: variable 'ansible_pipelining' from source: unknown 41684 1727204455.80772: variable 'ansible_timeout' from source: unknown 41684 1727204455.80776: variable 'ansible_ssh_extra_args' 
from source: host vars for 'managed-node1' 41684 1727204455.80880: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 41684 1727204455.80884: variable 'omit' from source: magic vars 41684 1727204455.80890: starting attempt loop 41684 1727204455.80893: running the handler 41684 1727204455.80902: handler run complete 41684 1727204455.80910: attempt loop complete, returning result 41684 1727204455.80913: _execute() done 41684 1727204455.80915: dumping result to json 41684 1727204455.80919: done dumping result, returning 41684 1727204455.80926: done running TaskExecutor() for managed-node1/TASK: Set current_interfaces [0affcd87-79f5-3839-086d-0000000003ac] 41684 1727204455.80932: sending task result for task 0affcd87-79f5-3839-086d-0000000003ac 41684 1727204455.81018: done sending task result for task 0affcd87-79f5-3839-086d-0000000003ac 41684 1727204455.81021: WORKER PROCESS EXITING ok: [managed-node1] => { "ansible_facts": { "current_interfaces": [ "bonding_masters", "eth0", "ethtest0", "lo", "peerethtest0", "rpltstbr" ] }, "changed": false } 41684 1727204455.81100: no more pending results, returning what we have 41684 1727204455.81103: results queue empty 41684 1727204455.81104: checking for any_errors_fatal 41684 1727204455.81110: done checking for any_errors_fatal 41684 1727204455.81111: checking for max_fail_percentage 41684 1727204455.81113: done checking for max_fail_percentage 41684 1727204455.81113: checking to see if all hosts have failed and the running result is not ok 41684 1727204455.81114: done checking to see if all hosts have failed 41684 1727204455.81115: getting the remaining hosts for this loop 41684 1727204455.81116: done getting the remaining hosts for this loop 41684 1727204455.81119: getting 
the next task for host managed-node1 41684 1727204455.81126: done getting next task for host managed-node1 41684 1727204455.81129: ^ task is: TASK: Show current_interfaces 41684 1727204455.81132: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 41684 1727204455.81135: getting variables 41684 1727204455.81136: in VariableManager get_vars() 41684 1727204455.81170: Calling all_inventory to load vars for managed-node1 41684 1727204455.81173: Calling groups_inventory to load vars for managed-node1 41684 1727204455.81174: Calling all_plugins_inventory to load vars for managed-node1 41684 1727204455.81181: Calling all_plugins_play to load vars for managed-node1 41684 1727204455.81183: Calling groups_plugins_inventory to load vars for managed-node1 41684 1727204455.81185: Calling groups_plugins_play to load vars for managed-node1 41684 1727204455.81307: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41684 1727204455.81455: done with get_vars() 41684 1727204455.81462: done getting variables 41684 1727204455.81504: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Show current_interfaces] ************************************************* task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml:5 Tuesday 24 September 2024 15:00:55 -0400 (0:00:00.020) 0:00:12.216 ***** 41684 1727204455.81526: entering _queue_task() for managed-node1/debug 41684 1727204455.81719: worker is 1 (out of 1 available) 41684 1727204455.81732: exiting _queue_task() for managed-node1/debug 41684 1727204455.81745: done queuing things up, now waiting for results queue to drain 41684 1727204455.81746: waiting for pending results... 
41684 1727204455.81911: running TaskExecutor() for managed-node1/TASK: Show current_interfaces 41684 1727204455.81979: in run() - task 0affcd87-79f5-3839-086d-000000000375 41684 1727204455.81991: variable 'ansible_search_path' from source: unknown 41684 1727204455.81994: variable 'ansible_search_path' from source: unknown 41684 1727204455.82022: calling self._execute() 41684 1727204455.82090: variable 'ansible_host' from source: host vars for 'managed-node1' 41684 1727204455.82093: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41684 1727204455.82102: variable 'omit' from source: magic vars 41684 1727204455.82369: variable 'ansible_distribution_major_version' from source: facts 41684 1727204455.82381: Evaluated conditional (ansible_distribution_major_version != '6'): True 41684 1727204455.82387: variable 'omit' from source: magic vars 41684 1727204455.82422: variable 'omit' from source: magic vars 41684 1727204455.82494: variable 'current_interfaces' from source: set_fact 41684 1727204455.82515: variable 'omit' from source: magic vars 41684 1727204455.82551: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 41684 1727204455.82583: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 41684 1727204455.82601: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 41684 1727204455.82616: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41684 1727204455.82627: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41684 1727204455.82649: variable 'inventory_hostname' from source: host vars for 'managed-node1' 41684 1727204455.82653: variable 'ansible_host' from source: host vars for 'managed-node1' 41684 1727204455.82656: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41684 1727204455.82726: Set connection var ansible_connection to ssh 41684 1727204455.82730: Set connection var ansible_pipelining to False 41684 1727204455.82736: Set connection var ansible_module_compression to ZIP_DEFLATED 41684 1727204455.82745: Set connection var ansible_timeout to 10 41684 1727204455.82755: Set connection var ansible_shell_executable to /bin/sh 41684 1727204455.82758: Set connection var ansible_shell_type to sh 41684 1727204455.82780: variable 'ansible_shell_executable' from source: unknown 41684 1727204455.82783: variable 'ansible_connection' from source: unknown 41684 1727204455.82785: variable 'ansible_module_compression' from source: unknown 41684 1727204455.82788: variable 'ansible_shell_type' from source: unknown 41684 1727204455.82790: variable 'ansible_shell_executable' from source: unknown 41684 1727204455.82792: variable 'ansible_host' from source: host vars for 'managed-node1' 41684 1727204455.82794: variable 'ansible_pipelining' from source: unknown 41684 1727204455.82798: variable 'ansible_timeout' from source: unknown 41684 1727204455.82802: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41684 1727204455.82909: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 41684 1727204455.82918: variable 'omit' from source: magic vars 41684 1727204455.82923: starting attempt loop 41684 1727204455.82925: running the handler 41684 1727204455.82962: handler run complete 41684 1727204455.82977: attempt loop complete, returning result 41684 1727204455.82980: _execute() done 41684 1727204455.82983: dumping result to json 41684 1727204455.82985: done dumping result, returning 41684 1727204455.82992: done 
running TaskExecutor() for managed-node1/TASK: Show current_interfaces [0affcd87-79f5-3839-086d-000000000375] 41684 1727204455.82997: sending task result for task 0affcd87-79f5-3839-086d-000000000375 41684 1727204455.83083: done sending task result for task 0affcd87-79f5-3839-086d-000000000375 41684 1727204455.83086: WORKER PROCESS EXITING ok: [managed-node1] => {} MSG: current_interfaces: ['bonding_masters', 'eth0', 'ethtest0', 'lo', 'peerethtest0', 'rpltstbr'] 41684 1727204455.83129: no more pending results, returning what we have 41684 1727204455.83133: results queue empty 41684 1727204455.83134: checking for any_errors_fatal 41684 1727204455.83140: done checking for any_errors_fatal 41684 1727204455.83141: checking for max_fail_percentage 41684 1727204455.83142: done checking for max_fail_percentage 41684 1727204455.83143: checking to see if all hosts have failed and the running result is not ok 41684 1727204455.83144: done checking to see if all hosts have failed 41684 1727204455.83145: getting the remaining hosts for this loop 41684 1727204455.83146: done getting the remaining hosts for this loop 41684 1727204455.83150: getting the next task for host managed-node1 41684 1727204455.83162: done getting next task for host managed-node1 41684 1727204455.83169: ^ task is: TASK: Install iproute 41684 1727204455.83173: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 41684 1727204455.83178: getting variables 41684 1727204455.83179: in VariableManager get_vars() 41684 1727204455.83216: Calling all_inventory to load vars for managed-node1 41684 1727204455.83219: Calling groups_inventory to load vars for managed-node1 41684 1727204455.83221: Calling all_plugins_inventory to load vars for managed-node1 41684 1727204455.83230: Calling all_plugins_play to load vars for managed-node1 41684 1727204455.83232: Calling groups_plugins_inventory to load vars for managed-node1 41684 1727204455.83234: Calling groups_plugins_play to load vars for managed-node1 41684 1727204455.83363: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41684 1727204455.83495: done with get_vars() 41684 1727204455.83503: done getting variables 41684 1727204455.83543: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Install iproute] ********************************************************* task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:16 Tuesday 24 September 2024 15:00:55 -0400 (0:00:00.020) 0:00:12.237 ***** 41684 1727204455.83566: entering _queue_task() for managed-node1/package 41684 1727204455.83770: worker is 1 (out of 1 available) 41684 1727204455.83782: exiting _queue_task() for managed-node1/package 41684 1727204455.83796: done queuing things up, now waiting for results queue to drain 41684 1727204455.83798: waiting for pending results... 
41684 1727204455.83985: running TaskExecutor() for managed-node1/TASK: Install iproute 41684 1727204455.84049: in run() - task 0affcd87-79f5-3839-086d-0000000002ff 41684 1727204455.84061: variable 'ansible_search_path' from source: unknown 41684 1727204455.84069: variable 'ansible_search_path' from source: unknown 41684 1727204455.84096: calling self._execute() 41684 1727204455.84158: variable 'ansible_host' from source: host vars for 'managed-node1' 41684 1727204455.84167: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41684 1727204455.84174: variable 'omit' from source: magic vars 41684 1727204455.84428: variable 'ansible_distribution_major_version' from source: facts 41684 1727204455.84438: Evaluated conditional (ansible_distribution_major_version != '6'): True 41684 1727204455.84444: variable 'omit' from source: magic vars 41684 1727204455.84478: variable 'omit' from source: magic vars 41684 1727204455.84614: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 41684 1727204455.86671: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 41684 1727204455.86716: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 41684 1727204455.86751: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 41684 1727204455.86779: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 41684 1727204455.86800: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 41684 1727204455.86876: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 41684 1727204455.87105: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 41684 1727204455.87123: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 41684 1727204455.87153: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 41684 1727204455.87170: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 41684 1727204455.87244: variable '__network_is_ostree' from source: set_fact 41684 1727204455.87248: variable 'omit' from source: magic vars 41684 1727204455.87275: variable 'omit' from source: magic vars 41684 1727204455.87296: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 41684 1727204455.87317: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 41684 1727204455.87332: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 41684 1727204455.87344: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41684 1727204455.87352: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41684 1727204455.87379: variable 'inventory_hostname' from source: host vars for 'managed-node1' 41684 1727204455.87382: variable 'ansible_host' from source: host vars for 'managed-node1' 41684 1727204455.87384: variable 'ansible_ssh_extra_args' from 
source: host vars for 'managed-node1' 41684 1727204455.87447: Set connection var ansible_connection to ssh 41684 1727204455.87452: Set connection var ansible_pipelining to False 41684 1727204455.87457: Set connection var ansible_module_compression to ZIP_DEFLATED 41684 1727204455.87466: Set connection var ansible_timeout to 10 41684 1727204455.87471: Set connection var ansible_shell_executable to /bin/sh 41684 1727204455.87477: Set connection var ansible_shell_type to sh 41684 1727204455.87502: variable 'ansible_shell_executable' from source: unknown 41684 1727204455.87504: variable 'ansible_connection' from source: unknown 41684 1727204455.87507: variable 'ansible_module_compression' from source: unknown 41684 1727204455.87509: variable 'ansible_shell_type' from source: unknown 41684 1727204455.87511: variable 'ansible_shell_executable' from source: unknown 41684 1727204455.87513: variable 'ansible_host' from source: host vars for 'managed-node1' 41684 1727204455.87517: variable 'ansible_pipelining' from source: unknown 41684 1727204455.87519: variable 'ansible_timeout' from source: unknown 41684 1727204455.87523: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41684 1727204455.87594: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 41684 1727204455.87604: variable 'omit' from source: magic vars 41684 1727204455.87609: starting attempt loop 41684 1727204455.87612: running the handler 41684 1727204455.87617: variable 'ansible_facts' from source: unknown 41684 1727204455.87620: variable 'ansible_facts' from source: unknown 41684 1727204455.87647: _low_level_execute_command(): starting 41684 1727204455.87652: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 41684 
1727204455.88270: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 41684 1727204455.88273: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41684 1727204455.88276: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41684 1727204455.88279: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41684 1727204455.88398: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 41684 1727204455.88401: stderr chunk (state=3): >>>debug2: match not found <<< 41684 1727204455.88403: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204455.88405: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 41684 1727204455.88407: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 41684 1727204455.88409: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 41684 1727204455.88411: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41684 1727204455.88412: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41684 1727204455.88415: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41684 1727204455.88417: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 41684 1727204455.88418: stderr chunk (state=3): >>>debug2: match found <<< 41684 1727204455.88420: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204455.88673: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 41684 1727204455.88687: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41684 1727204455.88689: stderr 
chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41684 1727204455.88772: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41684 1727204455.90119: stdout chunk (state=3): >>>/root <<< 41684 1727204455.90230: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41684 1727204455.90328: stderr chunk (state=3): >>><<< 41684 1727204455.90346: stdout chunk (state=3): >>><<< 41684 1727204455.90473: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41684 1727204455.90477: _low_level_execute_command(): starting 41684 1727204455.90480: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204455.9038122-43078-137393249717363 `" && echo 
ansible-tmp-1727204455.9038122-43078-137393249717363="` echo /root/.ansible/tmp/ansible-tmp-1727204455.9038122-43078-137393249717363 `" ) && sleep 0' 41684 1727204455.91334: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 41684 1727204455.91349: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41684 1727204455.91369: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41684 1727204455.92123: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41684 1727204455.92172: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 41684 1727204455.92185: stderr chunk (state=3): >>>debug2: match not found <<< 41684 1727204455.92199: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204455.92220: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 41684 1727204455.92232: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 41684 1727204455.92243: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 41684 1727204455.92255: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41684 1727204455.92274: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41684 1727204455.92291: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41684 1727204455.92304: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 41684 1727204455.92314: stderr chunk (state=3): >>>debug2: match found <<< 41684 1727204455.92332: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204455.92407: stderr chunk (state=3): >>>debug1: 
auto-mux: Trying existing master <<< 41684 1727204455.92578: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41684 1727204455.92597: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41684 1727204455.92697: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41684 1727204455.94537: stdout chunk (state=3): >>>ansible-tmp-1727204455.9038122-43078-137393249717363=/root/.ansible/tmp/ansible-tmp-1727204455.9038122-43078-137393249717363 <<< 41684 1727204455.94754: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41684 1727204455.94758: stdout chunk (state=3): >>><<< 41684 1727204455.94760: stderr chunk (state=3): >>><<< 41684 1727204455.94872: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204455.9038122-43078-137393249717363=/root/.ansible/tmp/ansible-tmp-1727204455.9038122-43078-137393249717363 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: 
mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41684 1727204455.94876: variable 'ansible_module_compression' from source: unknown 41684 1727204455.94981: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-41684fyviudxd/ansiballz_cache/ansible.modules.dnf-ZIP_DEFLATED 41684 1727204455.94984: variable 'ansible_facts' from source: unknown 41684 1727204455.95048: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204455.9038122-43078-137393249717363/AnsiballZ_dnf.py 41684 1727204455.95487: Sending initial data 41684 1727204455.95491: Sent initial data (152 bytes) 41684 1727204455.96661: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 41684 1727204455.96684: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41684 1727204455.96701: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41684 1727204455.96720: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41684 1727204455.96771: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 41684 1727204455.96784: stderr chunk (state=3): >>>debug2: match not found <<< 41684 1727204455.96798: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204455.96816: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 41684 1727204455.96829: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 41684 1727204455.96841: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 41684 1727204455.96858: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41684 1727204455.96878: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41684 1727204455.96895: stderr chunk 
(state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41684 1727204455.96912: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 41684 1727204455.96924: stderr chunk (state=3): >>>debug2: match found <<< 41684 1727204455.96939: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204455.97023: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 41684 1727204455.97047: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41684 1727204455.97072: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41684 1727204455.97167: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41684 1727204455.98871: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 41684 1727204455.98914: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 41684 1727204455.98970: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-41684fyviudxd/tmpptp_lhzq /root/.ansible/tmp/ansible-tmp-1727204455.9038122-43078-137393249717363/AnsiballZ_dnf.py <<< 41684 1727204455.99020: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 41684 1727204456.00542: 
stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41684 1727204456.00770: stderr chunk (state=3): >>><<< 41684 1727204456.00773: stdout chunk (state=3): >>><<< 41684 1727204456.00776: done transferring module to remote 41684 1727204456.00778: _low_level_execute_command(): starting 41684 1727204456.00780: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204455.9038122-43078-137393249717363/ /root/.ansible/tmp/ansible-tmp-1727204455.9038122-43078-137393249717363/AnsiballZ_dnf.py && sleep 0' 41684 1727204456.01587: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 41684 1727204456.01720: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41684 1727204456.01731: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41684 1727204456.01745: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41684 1727204456.01787: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 41684 1727204456.01827: stderr chunk (state=3): >>>debug2: match not found <<< 41684 1727204456.01836: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204456.01850: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 41684 1727204456.01857: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 41684 1727204456.01868: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 41684 1727204456.01874: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41684 1727204456.01884: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41684 1727204456.01895: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf <<< 41684 1727204456.01902: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 41684 1727204456.01934: stderr chunk (state=3): >>>debug2: match found <<< 41684 1727204456.01944: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204456.02020: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 41684 1727204456.02165: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41684 1727204456.02175: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41684 1727204456.02265: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41684 1727204456.04065: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41684 1727204456.04070: stdout chunk (state=3): >>><<< 41684 1727204456.04073: stderr chunk (state=3): >>><<< 41684 1727204456.04171: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41684 1727204456.04174: _low_level_execute_command(): starting 41684 1727204456.04177: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204455.9038122-43078-137393249717363/AnsiballZ_dnf.py && sleep 0' 41684 1727204456.05116: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 41684 1727204456.05140: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41684 1727204456.05156: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41684 1727204456.05179: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41684 1727204456.05241: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 41684 1727204456.05262: stderr chunk (state=3): >>>debug2: match not found <<< 41684 1727204456.05281: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204456.05300: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 41684 1727204456.05314: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 41684 1727204456.05327: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 41684 1727204456.05340: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41684 1727204456.05361: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41684 1727204456.05381: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41684 1727204456.05392: stderr chunk (state=3): >>>debug2: checking match for 
'final all' host 10.31.9.148 originally 10.31.9.148 <<< 41684 1727204456.05402: stderr chunk (state=3): >>>debug2: match found <<< 41684 1727204456.05414: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204456.05503: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 41684 1727204456.05523: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41684 1727204456.05538: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41684 1727204456.05630: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41684 1727204456.97790: stdout chunk (state=3): >>> {"msg": "Nothing to do", "changed": false, "results": [], "rc": 0, "invocation": {"module_args": {"name": ["iproute"], "state": "present", "allow_downgrade": false, "allowerasing": false, "autoremove": false, "bugfix": false, "cacheonly": false, "disable_gpg_check": false, "disable_plugin": [], "disablerepo": [], "download_only": false, "enable_plugin": [], "enablerepo": [], "exclude": [], "installroot": "/", "install_repoquery": true, "install_weak_deps": true, "security": false, "skip_broken": false, "update_cache": false, "update_only": false, "validate_certs": true, "sslverify": true, "lock_timeout": 30, "use_backend": "auto", "best": null, "conf_file": null, "disable_excludes": null, "download_dir": null, "list": null, "nobest": null, "releasever": null}}} <<< 41684 1727204457.01913: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.148 closed. 
<<< 41684 1727204457.01970: stderr chunk (state=3): >>><<< 41684 1727204457.01974: stdout chunk (state=3): >>><<< 41684 1727204457.01992: _low_level_execute_command() done: rc=0, stdout= {"msg": "Nothing to do", "changed": false, "results": [], "rc": 0, "invocation": {"module_args": {"name": ["iproute"], "state": "present", "allow_downgrade": false, "allowerasing": false, "autoremove": false, "bugfix": false, "cacheonly": false, "disable_gpg_check": false, "disable_plugin": [], "disablerepo": [], "download_only": false, "enable_plugin": [], "enablerepo": [], "exclude": [], "installroot": "/", "install_repoquery": true, "install_weak_deps": true, "security": false, "skip_broken": false, "update_cache": false, "update_only": false, "validate_certs": true, "sslverify": true, "lock_timeout": 30, "use_backend": "auto", "best": null, "conf_file": null, "disable_excludes": null, "download_dir": null, "list": null, "nobest": null, "releasever": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 
debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.148 closed. 41684 1727204457.02026: done with _execute_module (ansible.legacy.dnf, {'name': 'iproute', 'state': 'present', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.dnf', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204455.9038122-43078-137393249717363/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 41684 1727204457.02035: _low_level_execute_command(): starting 41684 1727204457.02040: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204455.9038122-43078-137393249717363/ > /dev/null 2>&1 && sleep 0' 41684 1727204457.02520: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 41684 1727204457.02524: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41684 1727204457.02542: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found <<< 41684 1727204457.02554: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found <<< 41684 1727204457.02567: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204457.02610: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 41684 1727204457.02622: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 41684 1727204457.02690: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41684 1727204457.04508: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41684 1727204457.04559: stderr chunk (state=3): >>><<< 41684 1727204457.04563: stdout chunk (state=3): >>><<< 41684 1727204457.04585: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK 
debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41684 1727204457.04591: handler run complete 41684 1727204457.04710: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 41684 1727204457.04839: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 41684 1727204457.04874: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 41684 1727204457.04903: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 41684 1727204457.04926: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 41684 1727204457.04987: variable '__install_status' from source: set_fact 41684 1727204457.05002: Evaluated conditional (__install_status is success): True 41684 1727204457.05017: attempt loop complete, returning result 41684 1727204457.05020: _execute() done 41684 1727204457.05022: dumping result to json 41684 1727204457.05028: done dumping result, returning 41684 1727204457.05034: done running TaskExecutor() for managed-node1/TASK: Install iproute [0affcd87-79f5-3839-086d-0000000002ff] 41684 1727204457.05039: sending task result for task 0affcd87-79f5-3839-086d-0000000002ff 41684 1727204457.05135: done sending task result for task 0affcd87-79f5-3839-086d-0000000002ff 41684 1727204457.05137: WORKER PROCESS EXITING ok: [managed-node1] => { "attempts": 1, "changed": false, "rc": 0, "results": [] } MSG: Nothing to do 41684 1727204457.05220: no more pending results, returning what we have 41684 1727204457.05223: results queue empty 41684 1727204457.05224: checking for any_errors_fatal 41684 1727204457.05230: done checking for any_errors_fatal 41684 1727204457.05230: checking for max_fail_percentage 41684 1727204457.05232: done checking for 
max_fail_percentage 41684 1727204457.05232: checking to see if all hosts have failed and the running result is not ok 41684 1727204457.05233: done checking to see if all hosts have failed 41684 1727204457.05234: getting the remaining hosts for this loop 41684 1727204457.05235: done getting the remaining hosts for this loop 41684 1727204457.05239: getting the next task for host managed-node1 41684 1727204457.05246: done getting next task for host managed-node1 41684 1727204457.05249: ^ task is: TASK: Create veth interface {{ interface }} 41684 1727204457.05251: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 41684 1727204457.05255: getting variables 41684 1727204457.05256: in VariableManager get_vars() 41684 1727204457.05298: Calling all_inventory to load vars for managed-node1 41684 1727204457.05302: Calling groups_inventory to load vars for managed-node1 41684 1727204457.05304: Calling all_plugins_inventory to load vars for managed-node1 41684 1727204457.05313: Calling all_plugins_play to load vars for managed-node1 41684 1727204457.05315: Calling groups_plugins_inventory to load vars for managed-node1 41684 1727204457.05318: Calling groups_plugins_play to load vars for managed-node1 41684 1727204457.05490: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41684 1727204457.05620: done with get_vars() 41684 1727204457.05628: done getting variables 41684 1727204457.05676: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 41684 1727204457.05765: variable 'interface' from source: set_fact TASK [Create veth interface ethtest1] ****************************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:27 Tuesday 24 September 2024 15:00:57 -0400 (0:00:01.222) 0:00:13.459 ***** 41684 1727204457.05788: entering _queue_task() for managed-node1/command 41684 1727204457.05992: worker is 1 (out of 1 available) 41684 1727204457.06006: exiting _queue_task() for managed-node1/command 41684 1727204457.06019: done queuing things up, now waiting for results queue to drain 41684 1727204457.06020: waiting for pending results... 
41684 1727204457.06188: running TaskExecutor() for managed-node1/TASK: Create veth interface ethtest1 41684 1727204457.06250: in run() - task 0affcd87-79f5-3839-086d-000000000300 41684 1727204457.06261: variable 'ansible_search_path' from source: unknown 41684 1727204457.06268: variable 'ansible_search_path' from source: unknown 41684 1727204457.06467: variable 'interface' from source: set_fact 41684 1727204457.06525: variable 'interface' from source: set_fact 41684 1727204457.06575: variable 'interface' from source: set_fact 41684 1727204457.06684: Loaded config def from plugin (lookup/items) 41684 1727204457.06690: Loading LookupModule 'items' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/items.py 41684 1727204457.06707: variable 'omit' from source: magic vars 41684 1727204457.06794: variable 'ansible_host' from source: host vars for 'managed-node1' 41684 1727204457.06803: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41684 1727204457.06811: variable 'omit' from source: magic vars 41684 1727204457.06968: variable 'ansible_distribution_major_version' from source: facts 41684 1727204457.06971: Evaluated conditional (ansible_distribution_major_version != '6'): True 41684 1727204457.07097: variable 'type' from source: set_fact 41684 1727204457.07101: variable 'state' from source: include params 41684 1727204457.07104: variable 'interface' from source: set_fact 41684 1727204457.07110: variable 'current_interfaces' from source: set_fact 41684 1727204457.07115: Evaluated conditional (type == 'veth' and state == 'present' and interface not in current_interfaces): True 41684 1727204457.07124: variable 'omit' from source: magic vars 41684 1727204457.07148: variable 'omit' from source: magic vars 41684 1727204457.07182: variable 'item' from source: unknown 41684 1727204457.07233: variable 'item' from source: unknown 41684 1727204457.07246: variable 'omit' from source: magic vars 41684 1727204457.07274: trying 
/usr/local/lib/python3.12/site-packages/ansible/plugins/connection 41684 1727204457.07298: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 41684 1727204457.07312: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 41684 1727204457.07325: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41684 1727204457.07334: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41684 1727204457.07361: variable 'inventory_hostname' from source: host vars for 'managed-node1' 41684 1727204457.07371: variable 'ansible_host' from source: host vars for 'managed-node1' 41684 1727204457.07373: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41684 1727204457.07437: Set connection var ansible_connection to ssh 41684 1727204457.07446: Set connection var ansible_pipelining to False 41684 1727204457.07452: Set connection var ansible_module_compression to ZIP_DEFLATED 41684 1727204457.07457: Set connection var ansible_timeout to 10 41684 1727204457.07467: Set connection var ansible_shell_executable to /bin/sh 41684 1727204457.07470: Set connection var ansible_shell_type to sh 41684 1727204457.07484: variable 'ansible_shell_executable' from source: unknown 41684 1727204457.07487: variable 'ansible_connection' from source: unknown 41684 1727204457.07489: variable 'ansible_module_compression' from source: unknown 41684 1727204457.07492: variable 'ansible_shell_type' from source: unknown 41684 1727204457.07494: variable 'ansible_shell_executable' from source: unknown 41684 1727204457.07496: variable 'ansible_host' from source: host vars for 'managed-node1' 41684 1727204457.07502: variable 'ansible_pipelining' from source: unknown 41684 1727204457.07504: variable 'ansible_timeout' from 
source: unknown 41684 1727204457.07506: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41684 1727204457.07602: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 41684 1727204457.07610: variable 'omit' from source: magic vars 41684 1727204457.07619: starting attempt loop 41684 1727204457.07622: running the handler 41684 1727204457.07631: _low_level_execute_command(): starting 41684 1727204457.07638: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 41684 1727204457.08164: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 41684 1727204457.08170: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41684 1727204457.08196: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204457.08204: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 41684 1727204457.08207: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204457.08248: stderr chunk (state=3): 
>>>debug1: auto-mux: Trying existing master <<< 41684 1727204457.08260: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 41684 1727204457.08329: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41684 1727204457.09865: stdout chunk (state=3): >>>/root <<< 41684 1727204457.09971: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41684 1727204457.10025: stderr chunk (state=3): >>><<< 41684 1727204457.10028: stdout chunk (state=3): >>><<< 41684 1727204457.10052: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41684 1727204457.10063: _low_level_execute_command(): starting 41684 1727204457.10076: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo 
/root/.ansible/tmp/ansible-tmp-1727204457.100525-43119-263920715747532 `" && echo ansible-tmp-1727204457.100525-43119-263920715747532="` echo /root/.ansible/tmp/ansible-tmp-1727204457.100525-43119-263920715747532 `" ) && sleep 0' 41684 1727204457.10532: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 41684 1727204457.10544: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41684 1727204457.10560: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found <<< 41684 1727204457.10584: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41684 1727204457.10598: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204457.10638: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 41684 1727204457.10651: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 41684 1727204457.10713: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41684 1727204457.12551: stdout chunk (state=3): >>>ansible-tmp-1727204457.100525-43119-263920715747532=/root/.ansible/tmp/ansible-tmp-1727204457.100525-43119-263920715747532 <<< 41684 1727204457.12659: stderr chunk (state=3): >>>debug2: 
Received exit status from master 0 <<< 41684 1727204457.12716: stderr chunk (state=3): >>><<< 41684 1727204457.12722: stdout chunk (state=3): >>><<< 41684 1727204457.12742: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204457.100525-43119-263920715747532=/root/.ansible/tmp/ansible-tmp-1727204457.100525-43119-263920715747532 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41684 1727204457.12771: variable 'ansible_module_compression' from source: unknown 41684 1727204457.12816: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-41684fyviudxd/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 41684 1727204457.12848: variable 'ansible_facts' from source: unknown 41684 1727204457.12911: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204457.100525-43119-263920715747532/AnsiballZ_command.py 41684 1727204457.13026: Sending initial data 41684 
1727204457.13036: Sent initial data (155 bytes) 41684 1727204457.13732: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 41684 1727204457.13735: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41684 1727204457.13771: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204457.13774: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 41684 1727204457.13784: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204457.13838: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 41684 1727204457.13841: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41684 1727204457.13848: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41684 1727204457.13900: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41684 1727204457.15600: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server 
supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 41684 1727204457.15648: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 41684 1727204457.15703: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-41684fyviudxd/tmpo6kz061u /root/.ansible/tmp/ansible-tmp-1727204457.100525-43119-263920715747532/AnsiballZ_command.py <<< 41684 1727204457.15751: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 41684 1727204457.16597: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41684 1727204457.16713: stderr chunk (state=3): >>><<< 41684 1727204457.16716: stdout chunk (state=3): >>><<< 41684 1727204457.16735: done transferring module to remote 41684 1727204457.16744: _low_level_execute_command(): starting 41684 1727204457.16748: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204457.100525-43119-263920715747532/ /root/.ansible/tmp/ansible-tmp-1727204457.100525-43119-263920715747532/AnsiballZ_command.py && sleep 0' 41684 1727204457.17224: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 41684 1727204457.17230: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41684 1727204457.17252: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204457.17270: stderr chunk (state=3): >>>debug1: 
configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204457.17316: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 41684 1727204457.17328: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 41684 1727204457.17390: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41684 1727204457.19097: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41684 1727204457.19160: stderr chunk (state=3): >>><<< 41684 1727204457.19163: stdout chunk (state=3): >>><<< 41684 1727204457.19180: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41684 1727204457.19184: _low_level_execute_command(): starting 41684 1727204457.19189: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204457.100525-43119-263920715747532/AnsiballZ_command.py && sleep 0' 41684 1727204457.19650: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 41684 1727204457.19667: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41684 1727204457.19688: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found <<< 41684 1727204457.19703: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204457.19749: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 41684 1727204457.19762: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 41684 1727204457.19832: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session 
id: 2 <<< 41684 1727204457.33825: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "add", "ethtest1", "type", "veth", "peer", "name", "peerethtest1"], "start": "2024-09-24 15:00:57.327444", "end": "2024-09-24 15:00:57.337125", "delta": "0:00:00.009681", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link add ethtest1 type veth peer name peerethtest1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 41684 1727204457.36417: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.148 closed. <<< 41684 1727204457.36422: stderr chunk (state=3): >>><<< 41684 1727204457.36424: stdout chunk (state=3): >>><<< 41684 1727204457.36441: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "add", "ethtest1", "type", "veth", "peer", "name", "peerethtest1"], "start": "2024-09-24 15:00:57.327444", "end": "2024-09-24 15:00:57.337125", "delta": "0:00:00.009681", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link add ethtest1 type veth peer name peerethtest1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.148 closed. 41684 1727204457.36478: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip link add ethtest1 type veth peer name peerethtest1', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204457.100525-43119-263920715747532/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 41684 1727204457.36484: _low_level_execute_command(): starting 41684 1727204457.36489: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204457.100525-43119-263920715747532/ > /dev/null 2>&1 && sleep 0' 41684 1727204457.36954: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 
41684 1727204457.36958: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41684 1727204457.36989: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204457.36994: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 41684 1727204457.37001: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204457.37052: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 41684 1727204457.37055: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 41684 1727204457.37122: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41684 1727204457.40188: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41684 1727204457.40242: stderr chunk (state=3): >>><<< 41684 1727204457.40246: stdout chunk (state=3): >>><<< 41684 1727204457.40471: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41684 1727204457.40475: handler run complete 41684 1727204457.40477: Evaluated conditional (False): False 41684 1727204457.40480: attempt loop complete, returning result 41684 1727204457.40482: variable 'item' from source: unknown 41684 1727204457.40484: variable 'item' from source: unknown ok: [managed-node1] => (item=ip link add ethtest1 type veth peer name peerethtest1) => { "ansible_loop_var": "item", "changed": false, "cmd": [ "ip", "link", "add", "ethtest1", "type", "veth", "peer", "name", "peerethtest1" ], "delta": "0:00:00.009681", "end": "2024-09-24 15:00:57.337125", "item": "ip link add ethtest1 type veth peer name peerethtest1", "rc": 0, "start": "2024-09-24 15:00:57.327444" } 41684 1727204457.40754: variable 'ansible_host' from source: host vars for 'managed-node1' 41684 1727204457.40757: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41684 1727204457.40760: variable 'omit' from source: magic vars 41684 1727204457.40900: variable 'ansible_distribution_major_version' from source: facts 41684 1727204457.40912: Evaluated conditional (ansible_distribution_major_version != '6'): True 41684 1727204457.41106: variable 'type' from source: set_fact 41684 1727204457.41116: 
variable 'state' from source: include params 41684 1727204457.41125: variable 'interface' from source: set_fact 41684 1727204457.41133: variable 'current_interfaces' from source: set_fact 41684 1727204457.41144: Evaluated conditional (type == 'veth' and state == 'present' and interface not in current_interfaces): True 41684 1727204457.41153: variable 'omit' from source: magic vars 41684 1727204457.41175: variable 'omit' from source: magic vars 41684 1727204457.41223: variable 'item' from source: unknown 41684 1727204457.41289: variable 'item' from source: unknown 41684 1727204457.41312: variable 'omit' from source: magic vars 41684 1727204457.41339: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 41684 1727204457.41354: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41684 1727204457.41367: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41684 1727204457.41389: variable 'inventory_hostname' from source: host vars for 'managed-node1' 41684 1727204457.41397: variable 'ansible_host' from source: host vars for 'managed-node1' 41684 1727204457.41409: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41684 1727204457.41494: Set connection var ansible_connection to ssh 41684 1727204457.41505: Set connection var ansible_pipelining to False 41684 1727204457.41519: Set connection var ansible_module_compression to ZIP_DEFLATED 41684 1727204457.41529: Set connection var ansible_timeout to 10 41684 1727204457.41541: Set connection var ansible_shell_executable to /bin/sh 41684 1727204457.41548: Set connection var ansible_shell_type to sh 41684 1727204457.41578: variable 'ansible_shell_executable' from source: unknown 41684 1727204457.41586: variable 'ansible_connection' from 
source: unknown 41684 1727204457.41593: variable 'ansible_module_compression' from source: unknown 41684 1727204457.41599: variable 'ansible_shell_type' from source: unknown 41684 1727204457.41605: variable 'ansible_shell_executable' from source: unknown 41684 1727204457.41611: variable 'ansible_host' from source: host vars for 'managed-node1' 41684 1727204457.41624: variable 'ansible_pipelining' from source: unknown 41684 1727204457.41631: variable 'ansible_timeout' from source: unknown 41684 1727204457.41639: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41684 1727204457.41742: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 41684 1727204457.41758: variable 'omit' from source: magic vars 41684 1727204457.41770: starting attempt loop 41684 1727204457.41777: running the handler 41684 1727204457.41789: _low_level_execute_command(): starting 41684 1727204457.41797: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 41684 1727204457.42497: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 41684 1727204457.42511: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41684 1727204457.42525: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41684 1727204457.42542: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41684 1727204457.42589: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 41684 1727204457.42604: stderr chunk (state=3): >>>debug2: match not found <<< 41684 1727204457.42618: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204457.42634: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 41684 1727204457.42646: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 41684 1727204457.42657: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 41684 1727204457.42675: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41684 1727204457.42688: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41684 1727204457.42702: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41684 1727204457.42716: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 41684 1727204457.42732: stderr chunk (state=3): >>>debug2: match found <<< 41684 1727204457.42745: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204457.42826: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 41684 1727204457.42853: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41684 1727204457.42875: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41684 1727204457.42971: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41684 1727204457.44524: stdout chunk (state=3): >>>/root <<< 41684 1727204457.44625: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41684 1727204457.44703: stderr chunk (state=3): >>><<< 41684 1727204457.44705: stdout chunk (state=3): >>><<< 41684 1727204457.44766: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41684 1727204457.44769: _low_level_execute_command(): starting 41684 1727204457.44774: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204457.4471698-43119-202455657535495 `" && echo ansible-tmp-1727204457.4471698-43119-202455657535495="` echo /root/.ansible/tmp/ansible-tmp-1727204457.4471698-43119-202455657535495 `" ) && sleep 0' 41684 1727204457.45393: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 41684 1727204457.45402: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41684 1727204457.45412: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41684 1727204457.45426: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41684 1727204457.45470: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 41684 1727204457.45478: stderr 
chunk (state=3): >>>debug2: match not found <<< 41684 1727204457.45488: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204457.45501: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 41684 1727204457.45508: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 41684 1727204457.45514: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 41684 1727204457.45522: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41684 1727204457.45530: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41684 1727204457.45541: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41684 1727204457.45548: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 41684 1727204457.45557: stderr chunk (state=3): >>>debug2: match found <<< 41684 1727204457.45571: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204457.45647: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 41684 1727204457.45666: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41684 1727204457.45680: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41684 1727204457.45774: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41684 1727204457.47620: stdout chunk (state=3): >>>ansible-tmp-1727204457.4471698-43119-202455657535495=/root/.ansible/tmp/ansible-tmp-1727204457.4471698-43119-202455657535495 <<< 41684 1727204457.47736: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41684 1727204457.47803: stderr chunk (state=3): >>><<< 41684 1727204457.47807: stdout chunk (state=3): >>><<< 41684 1727204457.47826: 
_low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204457.4471698-43119-202455657535495=/root/.ansible/tmp/ansible-tmp-1727204457.4471698-43119-202455657535495 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41684 1727204457.47873: variable 'ansible_module_compression' from source: unknown 41684 1727204457.47893: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-41684fyviudxd/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 41684 1727204457.47909: variable 'ansible_facts' from source: unknown 41684 1727204457.47970: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204457.4471698-43119-202455657535495/AnsiballZ_command.py 41684 1727204457.48095: Sending initial data 41684 1727204457.48098: Sent initial data (156 bytes) 41684 1727204457.49050: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 41684 1727204457.49059: stderr 
chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41684 1727204457.49071: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41684 1727204457.49086: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41684 1727204457.49125: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 41684 1727204457.49133: stderr chunk (state=3): >>>debug2: match not found <<< 41684 1727204457.49145: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204457.49156: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 41684 1727204457.49167: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 41684 1727204457.49174: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 41684 1727204457.49181: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41684 1727204457.49191: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41684 1727204457.49203: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41684 1727204457.49210: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 41684 1727204457.49217: stderr chunk (state=3): >>>debug2: match found <<< 41684 1727204457.49226: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204457.49301: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 41684 1727204457.49321: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41684 1727204457.49335: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41684 1727204457.49419: stderr chunk (state=3): 
>>>debug1: mux_client_request_session: master session id: 2 <<< 41684 1727204457.51140: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 41684 1727204457.51190: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 41684 1727204457.51241: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-41684fyviudxd/tmptvrjn99b /root/.ansible/tmp/ansible-tmp-1727204457.4471698-43119-202455657535495/AnsiballZ_command.py <<< 41684 1727204457.51295: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 41684 1727204457.52620: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41684 1727204457.52624: stderr chunk (state=3): >>><<< 41684 1727204457.52627: stdout chunk (state=3): >>><<< 41684 1727204457.52629: done transferring module to remote 41684 1727204457.52631: _low_level_execute_command(): starting 41684 1727204457.52633: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204457.4471698-43119-202455657535495/ /root/.ansible/tmp/ansible-tmp-1727204457.4471698-43119-202455657535495/AnsiballZ_command.py && sleep 0' 41684 1727204457.53160: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 41684 1727204457.53185: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config 
<<< 41684 1727204457.53201: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41684 1727204457.53221: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41684 1727204457.53270: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 41684 1727204457.53286: stderr chunk (state=3): >>>debug2: match not found <<< 41684 1727204457.53302: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204457.53322: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 41684 1727204457.53335: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 41684 1727204457.53345: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 41684 1727204457.53356: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41684 1727204457.53374: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41684 1727204457.53390: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41684 1727204457.53402: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 41684 1727204457.53417: stderr chunk (state=3): >>>debug2: match found <<< 41684 1727204457.53431: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204457.53511: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 41684 1727204457.53537: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41684 1727204457.53551: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41684 1727204457.53643: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41684 
1727204457.55364: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41684 1727204457.55421: stderr chunk (state=3): >>><<< 41684 1727204457.55425: stdout chunk (state=3): >>><<< 41684 1727204457.55439: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41684 1727204457.55442: _low_level_execute_command(): starting 41684 1727204457.55447: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204457.4471698-43119-202455657535495/AnsiballZ_command.py && sleep 0' 41684 1727204457.56069: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 41684 1727204457.56080: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41684 1727204457.56090: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41684 
1727204457.56104: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41684 1727204457.56141: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 41684 1727204457.56148: stderr chunk (state=3): >>>debug2: match not found <<< 41684 1727204457.56158: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204457.56177: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 41684 1727204457.56184: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 41684 1727204457.56190: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 41684 1727204457.56198: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41684 1727204457.56208: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41684 1727204457.56220: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41684 1727204457.56227: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 41684 1727204457.56230: stderr chunk (state=3): >>>debug2: match found <<< 41684 1727204457.56239: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204457.56313: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 41684 1727204457.56331: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41684 1727204457.56344: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41684 1727204457.56430: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41684 1727204457.69749: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", 
"set", "peerethtest1", "up"], "start": "2024-09-24 15:00:57.693487", "end": "2024-09-24 15:00:57.696690", "delta": "0:00:00.003203", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link set peerethtest1 up", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 41684 1727204457.70815: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.148 closed. <<< 41684 1727204457.70881: stderr chunk (state=3): >>><<< 41684 1727204457.70885: stdout chunk (state=3): >>><<< 41684 1727204457.70906: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "set", "peerethtest1", "up"], "start": "2024-09-24 15:00:57.693487", "end": "2024-09-24 15:00:57.696690", "delta": "0:00:00.003203", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link set peerethtest1 up", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: 
checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.148 closed. 41684 1727204457.70966: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip link set peerethtest1 up', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204457.4471698-43119-202455657535495/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 41684 1727204457.70970: _low_level_execute_command(): starting 41684 1727204457.70973: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204457.4471698-43119-202455657535495/ > /dev/null 2>&1 && sleep 0' 41684 1727204457.71496: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 41684 1727204457.71502: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41684 1727204457.71538: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found <<< 41684 
1727204457.71551: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found <<< 41684 1727204457.71568: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204457.71611: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 41684 1727204457.71623: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 41684 1727204457.71690: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41684 1727204457.73425: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41684 1727204457.73482: stderr chunk (state=3): >>><<< 41684 1727204457.73486: stdout chunk (state=3): >>><<< 41684 1727204457.73503: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41684 1727204457.73508: handler run complete 41684 1727204457.73524: Evaluated conditional (False): False 41684 1727204457.73531: attempt loop complete, returning result 41684 1727204457.73546: variable 'item' from source: unknown 41684 1727204457.73615: variable 'item' from source: unknown ok: [managed-node1] => (item=ip link set peerethtest1 up) => { "ansible_loop_var": "item", "changed": false, "cmd": [ "ip", "link", "set", "peerethtest1", "up" ], "delta": "0:00:00.003203", "end": "2024-09-24 15:00:57.696690", "item": "ip link set peerethtest1 up", "rc": 0, "start": "2024-09-24 15:00:57.693487" } 41684 1727204457.73734: variable 'ansible_host' from source: host vars for 'managed-node1' 41684 1727204457.73741: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41684 1727204457.73744: variable 'omit' from source: magic vars 41684 1727204457.73872: variable 'ansible_distribution_major_version' from source: facts 41684 1727204457.73879: Evaluated conditional (ansible_distribution_major_version != '6'): True 41684 1727204457.74151: variable 'type' from source: set_fact 41684 1727204457.74154: variable 'state' from source: include params 41684 1727204457.74157: variable 'interface' from source: set_fact 41684 1727204457.74159: variable 'current_interfaces' from source: set_fact 41684 1727204457.74161: Evaluated conditional (type == 'veth' and state == 'present' and interface not in current_interfaces): True 41684 1727204457.74163: variable 'omit' from source: magic vars 41684 
1727204457.74168: variable 'omit' from source: magic vars 41684 1727204457.74182: variable 'item' from source: unknown 41684 1727204457.74241: variable 'item' from source: unknown 41684 1727204457.74259: variable 'omit' from source: magic vars 41684 1727204457.74281: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 41684 1727204457.74290: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41684 1727204457.74297: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41684 1727204457.74312: variable 'inventory_hostname' from source: host vars for 'managed-node1' 41684 1727204457.74315: variable 'ansible_host' from source: host vars for 'managed-node1' 41684 1727204457.74317: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41684 1727204457.74399: Set connection var ansible_connection to ssh 41684 1727204457.74402: Set connection var ansible_pipelining to False 41684 1727204457.74409: Set connection var ansible_module_compression to ZIP_DEFLATED 41684 1727204457.74415: Set connection var ansible_timeout to 10 41684 1727204457.74423: Set connection var ansible_shell_executable to /bin/sh 41684 1727204457.74425: Set connection var ansible_shell_type to sh 41684 1727204457.74452: variable 'ansible_shell_executable' from source: unknown 41684 1727204457.74455: variable 'ansible_connection' from source: unknown 41684 1727204457.74457: variable 'ansible_module_compression' from source: unknown 41684 1727204457.74460: variable 'ansible_shell_type' from source: unknown 41684 1727204457.74462: variable 'ansible_shell_executable' from source: unknown 41684 1727204457.74469: variable 'ansible_host' from source: host vars for 'managed-node1' 41684 1727204457.74472: variable 
'ansible_pipelining' from source: unknown 41684 1727204457.74476: variable 'ansible_timeout' from source: unknown 41684 1727204457.74478: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41684 1727204457.74567: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 41684 1727204457.74581: variable 'omit' from source: magic vars 41684 1727204457.74589: starting attempt loop 41684 1727204457.74598: running the handler 41684 1727204457.74609: _low_level_execute_command(): starting 41684 1727204457.74616: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 41684 1727204457.75357: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 41684 1727204457.75377: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41684 1727204457.75409: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204457.75412: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 41684 1727204457.75414: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204457.75457: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 41684 1727204457.75477: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 41684 1727204457.75533: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41684 1727204457.77053: stdout chunk (state=3): >>>/root <<< 41684 1727204457.77166: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41684 1727204457.77223: stderr chunk (state=3): >>><<< 41684 1727204457.77227: stdout chunk (state=3): >>><<< 41684 1727204457.77243: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41684 1727204457.77251: _low_level_execute_command(): starting 41684 1727204457.77256: 
_low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204457.7724254-43119-111134494250358 `" && echo ansible-tmp-1727204457.7724254-43119-111134494250358="` echo /root/.ansible/tmp/ansible-tmp-1727204457.7724254-43119-111134494250358 `" ) && sleep 0' 41684 1727204457.77739: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 41684 1727204457.77745: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41684 1727204457.77779: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204457.77782: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 41684 1727204457.77784: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 41684 1727204457.77791: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204457.77843: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 41684 1727204457.77846: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41684 1727204457.77852: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41684 1727204457.77907: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41684 
1727204457.79747: stdout chunk (state=3): >>>ansible-tmp-1727204457.7724254-43119-111134494250358=/root/.ansible/tmp/ansible-tmp-1727204457.7724254-43119-111134494250358 <<< 41684 1727204457.79869: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41684 1727204457.79952: stderr chunk (state=3): >>><<< 41684 1727204457.79955: stdout chunk (state=3): >>><<< 41684 1727204457.79973: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204457.7724254-43119-111134494250358=/root/.ansible/tmp/ansible-tmp-1727204457.7724254-43119-111134494250358 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41684 1727204457.79990: variable 'ansible_module_compression' from source: unknown 41684 1727204457.80024: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-41684fyviudxd/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 41684 1727204457.80039: variable 
'ansible_facts' from source: unknown 41684 1727204457.80093: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204457.7724254-43119-111134494250358/AnsiballZ_command.py 41684 1727204457.80189: Sending initial data 41684 1727204457.80199: Sent initial data (156 bytes) 41684 1727204457.80880: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 41684 1727204457.80884: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41684 1727204457.80915: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204457.80919: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found <<< 41684 1727204457.80921: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204457.80970: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 41684 1727204457.80987: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41684 1727204457.80989: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41684 1727204457.81037: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41684 1727204457.82721: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension 
"posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 41684 1727204457.82773: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 41684 1727204457.82823: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-41684fyviudxd/tmp1eb1mjqv /root/.ansible/tmp/ansible-tmp-1727204457.7724254-43119-111134494250358/AnsiballZ_command.py <<< 41684 1727204457.82878: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 41684 1727204457.83714: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41684 1727204457.83824: stderr chunk (state=3): >>><<< 41684 1727204457.83829: stdout chunk (state=3): >>><<< 41684 1727204457.83848: done transferring module to remote 41684 1727204457.83855: _low_level_execute_command(): starting 41684 1727204457.83859: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204457.7724254-43119-111134494250358/ /root/.ansible/tmp/ansible-tmp-1727204457.7724254-43119-111134494250358/AnsiballZ_command.py && sleep 0' 41684 1727204457.84397: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 41684 1727204457.84401: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41684 1727204457.84403: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41684 1727204457.84500: stderr chunk (state=3): >>>debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41684 1727204457.84503: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204457.84506: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 41684 1727204457.84508: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 41684 1727204457.84510: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204457.84617: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 41684 1727204457.84669: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41684 1727204457.86367: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41684 1727204457.86417: stderr chunk (state=3): >>><<< 41684 1727204457.86421: stdout chunk (state=3): >>><<< 41684 1727204457.86434: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final 
Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41684 1727204457.86442: _low_level_execute_command(): starting 41684 1727204457.86444: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204457.7724254-43119-111134494250358/AnsiballZ_command.py && sleep 0' 41684 1727204457.86907: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 41684 1727204457.86913: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41684 1727204457.86943: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204457.86947: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 41684 
1727204457.86949: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204457.87007: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 41684 1727204457.87010: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41684 1727204457.87016: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41684 1727204457.87076: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41684 1727204458.00631: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "set", "ethtest1", "up"], "start": "2024-09-24 15:00:58.000037", "end": "2024-09-24 15:00:58.005516", "delta": "0:00:00.005479", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link set ethtest1 up", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}}<<< 41684 1727204458.00657: stdout chunk (state=3): >>> <<< 41684 1727204458.01790: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.148 closed. 
<<< 41684 1727204458.01849: stderr chunk (state=3): >>><<< 41684 1727204458.01852: stdout chunk (state=3): >>><<< 41684 1727204458.01874: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "set", "ethtest1", "up"], "start": "2024-09-24 15:00:58.000037", "end": "2024-09-24 15:00:58.005516", "delta": "0:00:00.005479", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link set ethtest1 up", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.148 closed. 
41684 1727204458.01895: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip link set ethtest1 up', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204457.7724254-43119-111134494250358/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 41684 1727204458.01900: _low_level_execute_command(): starting 41684 1727204458.01905: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204457.7724254-43119-111134494250358/ > /dev/null 2>&1 && sleep 0' 41684 1727204458.02454: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 41684 1727204458.02474: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41684 1727204458.02490: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41684 1727204458.02510: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41684 1727204458.02552: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 41684 1727204458.02567: stderr chunk (state=3): >>>debug2: match not found <<< 41684 1727204458.02584: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204458.02603: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 41684 1727204458.02619: stderr chunk (state=3): 
>>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 41684 1727204458.02632: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 41684 1727204458.02645: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41684 1727204458.02659: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41684 1727204458.02680: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41684 1727204458.02695: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 41684 1727204458.02706: stderr chunk (state=3): >>>debug2: match found <<< 41684 1727204458.02720: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204458.02796: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 41684 1727204458.02819: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41684 1727204458.02837: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41684 1727204458.02918: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41684 1727204458.04705: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41684 1727204458.04755: stderr chunk (state=3): >>><<< 41684 1727204458.04759: stdout chunk (state=3): >>><<< 41684 1727204458.04778: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41684 1727204458.04785: handler run complete 41684 1727204458.04801: Evaluated conditional (False): False 41684 1727204458.04808: attempt loop complete, returning result 41684 1727204458.04823: variable 'item' from source: unknown 41684 1727204458.04887: variable 'item' from source: unknown ok: [managed-node1] => (item=ip link set ethtest1 up) => { "ansible_loop_var": "item", "changed": false, "cmd": [ "ip", "link", "set", "ethtest1", "up" ], "delta": "0:00:00.005479", "end": "2024-09-24 15:00:58.005516", "item": "ip link set ethtest1 up", "rc": 0, "start": "2024-09-24 15:00:58.000037" } 41684 1727204458.05017: dumping result to json 41684 1727204458.05020: done dumping result, returning 41684 1727204458.05023: done running TaskExecutor() for managed-node1/TASK: Create veth interface ethtest1 [0affcd87-79f5-3839-086d-000000000300] 41684 1727204458.05025: sending task result for task 0affcd87-79f5-3839-086d-000000000300 41684 1727204458.05177: no more pending results, returning what we have 41684 1727204458.05182: results queue empty 41684 1727204458.05182: checking for any_errors_fatal 41684 1727204458.05190: done checking for any_errors_fatal 41684 1727204458.05191: checking for max_fail_percentage 41684 1727204458.05192: done checking for max_fail_percentage 41684 1727204458.05193: checking to see if all 
hosts have failed and the running result is not ok 41684 1727204458.05194: done checking to see if all hosts have failed 41684 1727204458.05194: getting the remaining hosts for this loop 41684 1727204458.05196: done getting the remaining hosts for this loop 41684 1727204458.05200: getting the next task for host managed-node1 41684 1727204458.05205: done getting next task for host managed-node1 41684 1727204458.05208: ^ task is: TASK: Set up veth as managed by NetworkManager 41684 1727204458.05210: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 41684 1727204458.05214: getting variables 41684 1727204458.05215: in VariableManager get_vars() 41684 1727204458.05254: Calling all_inventory to load vars for managed-node1 41684 1727204458.05257: Calling groups_inventory to load vars for managed-node1 41684 1727204458.05259: Calling all_plugins_inventory to load vars for managed-node1 41684 1727204458.05272: Calling all_plugins_play to load vars for managed-node1 41684 1727204458.05275: Calling groups_plugins_inventory to load vars for managed-node1 41684 1727204458.05277: Calling groups_plugins_play to load vars for managed-node1 41684 1727204458.05492: done sending task result for task 0affcd87-79f5-3839-086d-000000000300 41684 1727204458.05496: WORKER PROCESS EXITING 41684 1727204458.05520: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41684 1727204458.05773: done with get_vars() 41684 1727204458.05785: done getting variables 41684 1727204458.05846: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set up veth as managed by NetworkManager] ******************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:35 Tuesday 24 September 2024 15:00:58 -0400 (0:00:01.000) 0:00:14.460 ***** 41684 1727204458.05879: entering _queue_task() for managed-node1/command 41684 1727204458.06154: worker is 1 (out of 1 available) 41684 1727204458.06171: exiting _queue_task() for managed-node1/command 41684 1727204458.06184: done queuing things up, now waiting for results queue to drain 41684 1727204458.06185: waiting for pending results... 
41684 1727204458.06452: running TaskExecutor() for managed-node1/TASK: Set up veth as managed by NetworkManager 41684 1727204458.06557: in run() - task 0affcd87-79f5-3839-086d-000000000301 41684 1727204458.06583: variable 'ansible_search_path' from source: unknown 41684 1727204458.06592: variable 'ansible_search_path' from source: unknown 41684 1727204458.06635: calling self._execute() 41684 1727204458.06735: variable 'ansible_host' from source: host vars for 'managed-node1' 41684 1727204458.06767: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41684 1727204458.06781: variable 'omit' from source: magic vars 41684 1727204458.07302: variable 'ansible_distribution_major_version' from source: facts 41684 1727204458.07323: Evaluated conditional (ansible_distribution_major_version != '6'): True 41684 1727204458.07523: variable 'type' from source: set_fact 41684 1727204458.07616: variable 'state' from source: include params 41684 1727204458.07635: Evaluated conditional (type == 'veth' and state == 'present'): True 41684 1727204458.07653: variable 'omit' from source: magic vars 41684 1727204458.07693: variable 'omit' from source: magic vars 41684 1727204458.07835: variable 'interface' from source: set_fact 41684 1727204458.07857: variable 'omit' from source: magic vars 41684 1727204458.07905: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 41684 1727204458.07951: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 41684 1727204458.07984: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 41684 1727204458.08006: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41684 1727204458.08022: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 
41684 1727204458.08067: variable 'inventory_hostname' from source: host vars for 'managed-node1' 41684 1727204458.08078: variable 'ansible_host' from source: host vars for 'managed-node1' 41684 1727204458.08085: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41684 1727204458.08232: Set connection var ansible_connection to ssh 41684 1727204458.08283: Set connection var ansible_pipelining to False 41684 1727204458.08307: Set connection var ansible_module_compression to ZIP_DEFLATED 41684 1727204458.08333: Set connection var ansible_timeout to 10 41684 1727204458.08351: Set connection var ansible_shell_executable to /bin/sh 41684 1727204458.08366: Set connection var ansible_shell_type to sh 41684 1727204458.08400: variable 'ansible_shell_executable' from source: unknown 41684 1727204458.08439: variable 'ansible_connection' from source: unknown 41684 1727204458.08449: variable 'ansible_module_compression' from source: unknown 41684 1727204458.08466: variable 'ansible_shell_type' from source: unknown 41684 1727204458.08470: variable 'ansible_shell_executable' from source: unknown 41684 1727204458.08472: variable 'ansible_host' from source: host vars for 'managed-node1' 41684 1727204458.08486: variable 'ansible_pipelining' from source: unknown 41684 1727204458.08500: variable 'ansible_timeout' from source: unknown 41684 1727204458.08512: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41684 1727204458.08628: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 41684 1727204458.08636: variable 'omit' from source: magic vars 41684 1727204458.08641: starting attempt loop 41684 1727204458.08645: running the handler 41684 1727204458.08658: _low_level_execute_command(): 
starting 41684 1727204458.08667: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 41684 1727204458.09177: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 41684 1727204458.09198: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41684 1727204458.09212: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204458.09224: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204458.09276: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 41684 1727204458.09290: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 41684 1727204458.09376: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41684 1727204458.11183: stdout chunk (state=3): >>>/root <<< 41684 1727204458.11192: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41684 1727204458.11208: stdout chunk (state=3): >>><<< 41684 1727204458.11217: stderr chunk (state=3): >>><<< 41684 1727204458.11244: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41684 1727204458.11270: _low_level_execute_command(): starting 41684 1727204458.11284: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204458.112515-43168-252885897684271 `" && echo ansible-tmp-1727204458.112515-43168-252885897684271="` echo /root/.ansible/tmp/ansible-tmp-1727204458.112515-43168-252885897684271 `" ) && sleep 0' 41684 1727204458.12573: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 41684 1727204458.12592: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41684 1727204458.12608: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41684 1727204458.12636: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41684 1727204458.12728: stderr chunk 
(state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 41684 1727204458.12739: stderr chunk (state=3): >>>debug2: match not found <<< 41684 1727204458.12751: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204458.12776: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 41684 1727204458.12810: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 41684 1727204458.12840: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 41684 1727204458.12874: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204458.12918: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 41684 1727204458.12939: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41684 1727204458.12946: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41684 1727204458.13001: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41684 1727204458.14856: stdout chunk (state=3): >>>ansible-tmp-1727204458.112515-43168-252885897684271=/root/.ansible/tmp/ansible-tmp-1727204458.112515-43168-252885897684271 <<< 41684 1727204458.14977: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41684 1727204458.15063: stderr chunk (state=3): >>><<< 41684 1727204458.15084: stdout chunk (state=3): >>><<< 41684 1727204458.15276: _low_level_execute_command() done: rc=0, 
stdout=ansible-tmp-1727204458.112515-43168-252885897684271=/root/.ansible/tmp/ansible-tmp-1727204458.112515-43168-252885897684271 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41684 1727204458.15280: variable 'ansible_module_compression' from source: unknown 41684 1727204458.15283: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-41684fyviudxd/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 41684 1727204458.15286: variable 'ansible_facts' from source: unknown 41684 1727204458.15359: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204458.112515-43168-252885897684271/AnsiballZ_command.py 41684 1727204458.15508: Sending initial data 41684 1727204458.15517: Sent initial data (155 bytes) 41684 1727204458.16194: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration 
data /etc/ssh/ssh_config <<< 41684 1727204458.16198: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41684 1727204458.16230: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204458.16233: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 41684 1727204458.16236: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204458.16292: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 41684 1727204458.16295: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 41684 1727204458.16356: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41684 1727204458.18044: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 41684 
1727204458.18106: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 41684 1727204458.18157: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-41684fyviudxd/tmpv3bmtb2u /root/.ansible/tmp/ansible-tmp-1727204458.112515-43168-252885897684271/AnsiballZ_command.py <<< 41684 1727204458.18207: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 41684 1727204458.19281: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41684 1727204458.19411: stderr chunk (state=3): >>><<< 41684 1727204458.19414: stdout chunk (state=3): >>><<< 41684 1727204458.19417: done transferring module to remote 41684 1727204458.19419: _low_level_execute_command(): starting 41684 1727204458.19421: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204458.112515-43168-252885897684271/ /root/.ansible/tmp/ansible-tmp-1727204458.112515-43168-252885897684271/AnsiballZ_command.py && sleep 0' 41684 1727204458.19871: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 41684 1727204458.19874: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41684 1727204458.19909: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204458.19912: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 41684 1727204458.19914: 
stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204458.19969: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 41684 1727204458.19973: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 41684 1727204458.20037: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41684 1727204458.21829: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41684 1727204458.21832: stdout chunk (state=3): >>><<< 41684 1727204458.21835: stderr chunk (state=3): >>><<< 41684 1727204458.21927: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master 
session id: 2 debug2: Received exit status from master 0 41684 1727204458.21930: _low_level_execute_command(): starting 41684 1727204458.21932: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204458.112515-43168-252885897684271/AnsiballZ_command.py && sleep 0' 41684 1727204458.22469: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 41684 1727204458.22483: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41684 1727204458.22498: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41684 1727204458.22517: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41684 1727204458.22560: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 41684 1727204458.22579: stderr chunk (state=3): >>>debug2: match not found <<< 41684 1727204458.22594: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204458.22611: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 41684 1727204458.22623: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 41684 1727204458.22635: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 41684 1727204458.22648: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41684 1727204458.22666: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41684 1727204458.22683: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41684 1727204458.22696: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 41684 1727204458.22708: stderr chunk (state=3): >>>debug2: match found <<< 41684 1727204458.22721: 
stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204458.22799: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 41684 1727204458.22822: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41684 1727204458.22838: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41684 1727204458.22929: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41684 1727204458.38085: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["nmcli", "d", "set", "ethtest1", "managed", "true"], "start": "2024-09-24 15:00:58.358661", "end": "2024-09-24 15:00:58.379638", "delta": "0:00:00.020977", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli d set ethtest1 managed true", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 41684 1727204458.39286: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.148 closed. 
<<< 41684 1727204458.39339: stderr chunk (state=3): >>><<< 41684 1727204458.39345: stdout chunk (state=3): >>><<< 41684 1727204458.39360: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["nmcli", "d", "set", "ethtest1", "managed", "true"], "start": "2024-09-24 15:00:58.358661", "end": "2024-09-24 15:00:58.379638", "delta": "0:00:00.020977", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli d set ethtest1 managed true", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.148 closed. 
41684 1727204458.39393: done with _execute_module (ansible.legacy.command, {'_raw_params': 'nmcli d set ethtest1 managed true', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204458.112515-43168-252885897684271/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 41684 1727204458.39404: _low_level_execute_command(): starting 41684 1727204458.39407: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204458.112515-43168-252885897684271/ > /dev/null 2>&1 && sleep 0' 41684 1727204458.39892: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 41684 1727204458.39896: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41684 1727204458.39928: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204458.39931: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 41684 1727204458.39933: stderr chunk (state=3): >>>debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204458.39980: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 41684 1727204458.39992: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 41684 1727204458.40060: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41684 1727204458.41805: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41684 1727204458.41848: stderr chunk (state=3): >>><<< 41684 1727204458.41851: stdout chunk (state=3): >>><<< 41684 1727204458.41868: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit 
status from master 0 41684 1727204458.41871: handler run complete 41684 1727204458.41889: Evaluated conditional (False): False 41684 1727204458.41899: attempt loop complete, returning result 41684 1727204458.41902: _execute() done 41684 1727204458.41908: dumping result to json 41684 1727204458.41913: done dumping result, returning 41684 1727204458.41923: done running TaskExecutor() for managed-node1/TASK: Set up veth as managed by NetworkManager [0affcd87-79f5-3839-086d-000000000301] 41684 1727204458.41928: sending task result for task 0affcd87-79f5-3839-086d-000000000301 41684 1727204458.42025: done sending task result for task 0affcd87-79f5-3839-086d-000000000301 41684 1727204458.42028: WORKER PROCESS EXITING ok: [managed-node1] => { "changed": false, "cmd": [ "nmcli", "d", "set", "ethtest1", "managed", "true" ], "delta": "0:00:00.020977", "end": "2024-09-24 15:00:58.379638", "rc": 0, "start": "2024-09-24 15:00:58.358661" } 41684 1727204458.42096: no more pending results, returning what we have 41684 1727204458.42100: results queue empty 41684 1727204458.42101: checking for any_errors_fatal 41684 1727204458.42115: done checking for any_errors_fatal 41684 1727204458.42115: checking for max_fail_percentage 41684 1727204458.42117: done checking for max_fail_percentage 41684 1727204458.42117: checking to see if all hosts have failed and the running result is not ok 41684 1727204458.42118: done checking to see if all hosts have failed 41684 1727204458.42119: getting the remaining hosts for this loop 41684 1727204458.42121: done getting the remaining hosts for this loop 41684 1727204458.42125: getting the next task for host managed-node1 41684 1727204458.42131: done getting next task for host managed-node1 41684 1727204458.42134: ^ task is: TASK: Delete veth interface {{ interface }} 41684 1727204458.42136: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, 
pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 41684 1727204458.42140: getting variables 41684 1727204458.42141: in VariableManager get_vars() 41684 1727204458.42187: Calling all_inventory to load vars for managed-node1 41684 1727204458.42190: Calling groups_inventory to load vars for managed-node1 41684 1727204458.42192: Calling all_plugins_inventory to load vars for managed-node1 41684 1727204458.42202: Calling all_plugins_play to load vars for managed-node1 41684 1727204458.42204: Calling groups_plugins_inventory to load vars for managed-node1 41684 1727204458.42206: Calling groups_plugins_play to load vars for managed-node1 41684 1727204458.42336: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41684 1727204458.42471: done with get_vars() 41684 1727204458.42480: done getting variables 41684 1727204458.42522: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 41684 1727204458.42612: variable 'interface' from source: set_fact TASK [Delete veth interface ethtest1] ****************************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:43 Tuesday 24 September 2024 15:00:58 -0400 (0:00:00.367) 0:00:14.827 ***** 41684 1727204458.42635: entering _queue_task() for 
managed-node1/command 41684 1727204458.42836: worker is 1 (out of 1 available) 41684 1727204458.42849: exiting _queue_task() for managed-node1/command 41684 1727204458.42866: done queuing things up, now waiting for results queue to drain 41684 1727204458.42867: waiting for pending results... 41684 1727204458.43025: running TaskExecutor() for managed-node1/TASK: Delete veth interface ethtest1 41684 1727204458.43094: in run() - task 0affcd87-79f5-3839-086d-000000000302 41684 1727204458.43131: variable 'ansible_search_path' from source: unknown 41684 1727204458.43138: variable 'ansible_search_path' from source: unknown 41684 1727204458.43188: calling self._execute() 41684 1727204458.43317: variable 'ansible_host' from source: host vars for 'managed-node1' 41684 1727204458.43328: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41684 1727204458.43358: variable 'omit' from source: magic vars 41684 1727204458.44056: variable 'ansible_distribution_major_version' from source: facts 41684 1727204458.44076: Evaluated conditional (ansible_distribution_major_version != '6'): True 41684 1727204458.44273: variable 'type' from source: set_fact 41684 1727204458.44284: variable 'state' from source: include params 41684 1727204458.44294: variable 'interface' from source: set_fact 41684 1727204458.44304: variable 'current_interfaces' from source: set_fact 41684 1727204458.44315: Evaluated conditional (type == 'veth' and state == 'absent' and interface in current_interfaces): False 41684 1727204458.44322: when evaluation is False, skipping this task 41684 1727204458.44329: _execute() done 41684 1727204458.44335: dumping result to json 41684 1727204458.44340: done dumping result, returning 41684 1727204458.44347: done running TaskExecutor() for managed-node1/TASK: Delete veth interface ethtest1 [0affcd87-79f5-3839-086d-000000000302] 41684 1727204458.44355: sending task result for task 0affcd87-79f5-3839-086d-000000000302 skipping: [managed-node1] => { 
"changed": false, "false_condition": "type == 'veth' and state == 'absent' and interface in current_interfaces", "skip_reason": "Conditional result was False" } 41684 1727204458.44492: no more pending results, returning what we have 41684 1727204458.44497: results queue empty 41684 1727204458.44498: checking for any_errors_fatal 41684 1727204458.44513: done checking for any_errors_fatal 41684 1727204458.44514: checking for max_fail_percentage 41684 1727204458.44516: done checking for max_fail_percentage 41684 1727204458.44516: checking to see if all hosts have failed and the running result is not ok 41684 1727204458.44517: done checking to see if all hosts have failed 41684 1727204458.44518: getting the remaining hosts for this loop 41684 1727204458.44519: done getting the remaining hosts for this loop 41684 1727204458.44523: getting the next task for host managed-node1 41684 1727204458.44529: done getting next task for host managed-node1 41684 1727204458.44532: ^ task is: TASK: Create dummy interface {{ interface }} 41684 1727204458.44534: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 41684 1727204458.44538: getting variables 41684 1727204458.44540: in VariableManager get_vars() 41684 1727204458.44586: Calling all_inventory to load vars for managed-node1 41684 1727204458.44589: Calling groups_inventory to load vars for managed-node1 41684 1727204458.44591: Calling all_plugins_inventory to load vars for managed-node1 41684 1727204458.44604: Calling all_plugins_play to load vars for managed-node1 41684 1727204458.44606: Calling groups_plugins_inventory to load vars for managed-node1 41684 1727204458.44610: Calling groups_plugins_play to load vars for managed-node1 41684 1727204458.44790: done sending task result for task 0affcd87-79f5-3839-086d-000000000302 41684 1727204458.44794: WORKER PROCESS EXITING 41684 1727204458.44817: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41684 1727204458.45103: done with get_vars() 41684 1727204458.45114: done getting variables 41684 1727204458.45183: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 41684 1727204458.45301: variable 'interface' from source: set_fact TASK [Create dummy interface ethtest1] ***************************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:49 Tuesday 24 September 2024 15:00:58 -0400 (0:00:00.026) 0:00:14.854 ***** 41684 1727204458.45330: entering _queue_task() for managed-node1/command 41684 1727204458.45588: worker is 1 (out of 1 available) 41684 1727204458.45608: exiting _queue_task() for managed-node1/command 41684 1727204458.45622: done queuing things up, now waiting for results queue to drain 41684 1727204458.45624: waiting for pending results... 
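Each of the skipped tasks in this stretch of the log follows the same pattern: a `command` task guarded by a `when` clause over `type`, `state`, and `current_interfaces`, so only the branch matching the requested interface type and state actually runs, and every other branch reports `false_condition`. A minimal sketch of two such tasks, assuming `ip link` commands (the exact commands inside manage_test_interface.yml are not visible in this log and are an assumption):

```yaml
# Hypothetical reconstruction of the guarded cleanup/setup tasks
# in manage_test_interface.yml -- the actual file may differ.
- name: Delete veth interface {{ interface }}
  command: ip link delete {{ interface }} type veth
  when: type == 'veth' and state == 'absent' and interface in current_interfaces

- name: Create dummy interface {{ interface }}
  command: ip link add {{ interface }} type dummy
  when: type == 'dummy' and state == 'present' and interface not in current_interfaces
```

When the `when` expression evaluates to False, the executor short-circuits before connecting to the host, which is why the skipped entries above show no SSH traffic, only "when evaluation is False, skipping this task".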
41684 1727204458.45908: running TaskExecutor() for managed-node1/TASK: Create dummy interface ethtest1 41684 1727204458.46018: in run() - task 0affcd87-79f5-3839-086d-000000000303 41684 1727204458.46041: variable 'ansible_search_path' from source: unknown 41684 1727204458.46053: variable 'ansible_search_path' from source: unknown 41684 1727204458.46103: calling self._execute() 41684 1727204458.46202: variable 'ansible_host' from source: host vars for 'managed-node1' 41684 1727204458.46214: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41684 1727204458.46228: variable 'omit' from source: magic vars 41684 1727204458.46614: variable 'ansible_distribution_major_version' from source: facts 41684 1727204458.46633: Evaluated conditional (ansible_distribution_major_version != '6'): True 41684 1727204458.46858: variable 'type' from source: set_fact 41684 1727204458.46874: variable 'state' from source: include params 41684 1727204458.46886: variable 'interface' from source: set_fact 41684 1727204458.46894: variable 'current_interfaces' from source: set_fact 41684 1727204458.46906: Evaluated conditional (type == 'dummy' and state == 'present' and interface not in current_interfaces): False 41684 1727204458.46913: when evaluation is False, skipping this task 41684 1727204458.46927: _execute() done 41684 1727204458.46936: dumping result to json 41684 1727204458.46944: done dumping result, returning 41684 1727204458.46954: done running TaskExecutor() for managed-node1/TASK: Create dummy interface ethtest1 [0affcd87-79f5-3839-086d-000000000303] 41684 1727204458.46969: sending task result for task 0affcd87-79f5-3839-086d-000000000303 skipping: [managed-node1] => { "changed": false, "false_condition": "type == 'dummy' and state == 'present' and interface not in current_interfaces", "skip_reason": "Conditional result was False" } 41684 1727204458.47121: no more pending results, returning what we have 41684 1727204458.47125: results queue empty 41684 
1727204458.47126: checking for any_errors_fatal 41684 1727204458.47134: done checking for any_errors_fatal 41684 1727204458.47135: checking for max_fail_percentage 41684 1727204458.47136: done checking for max_fail_percentage 41684 1727204458.47137: checking to see if all hosts have failed and the running result is not ok 41684 1727204458.47138: done checking to see if all hosts have failed 41684 1727204458.47139: getting the remaining hosts for this loop 41684 1727204458.47140: done getting the remaining hosts for this loop 41684 1727204458.47144: getting the next task for host managed-node1 41684 1727204458.47151: done getting next task for host managed-node1 41684 1727204458.47154: ^ task is: TASK: Delete dummy interface {{ interface }} 41684 1727204458.47157: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 41684 1727204458.47167: getting variables 41684 1727204458.47170: in VariableManager get_vars() 41684 1727204458.47213: Calling all_inventory to load vars for managed-node1 41684 1727204458.47216: Calling groups_inventory to load vars for managed-node1 41684 1727204458.47219: Calling all_plugins_inventory to load vars for managed-node1 41684 1727204458.47232: Calling all_plugins_play to load vars for managed-node1 41684 1727204458.47234: Calling groups_plugins_inventory to load vars for managed-node1 41684 1727204458.47237: Calling groups_plugins_play to load vars for managed-node1 41684 1727204458.47444: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41684 1727204458.47681: done with get_vars() 41684 1727204458.47692: done getting variables 41684 1727204458.47872: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 41684 1727204458.48034: variable 'interface' from source: set_fact TASK [Delete dummy interface ethtest1] ***************************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:54 Tuesday 24 September 2024 15:00:58 -0400 (0:00:00.027) 0:00:14.882 ***** 41684 1727204458.48085: entering _queue_task() for managed-node1/command 41684 1727204458.48141: done sending task result for task 0affcd87-79f5-3839-086d-000000000303 41684 1727204458.48145: WORKER PROCESS EXITING 41684 1727204458.48291: worker is 1 (out of 1 available) 41684 1727204458.48307: exiting _queue_task() for managed-node1/command 41684 1727204458.48320: done queuing things up, now waiting for results queue to drain 41684 1727204458.48321: waiting for pending results... 
41684 1727204458.48484: running TaskExecutor() for managed-node1/TASK: Delete dummy interface ethtest1 41684 1727204458.48549: in run() - task 0affcd87-79f5-3839-086d-000000000304 41684 1727204458.48561: variable 'ansible_search_path' from source: unknown 41684 1727204458.48566: variable 'ansible_search_path' from source: unknown 41684 1727204458.48595: calling self._execute() 41684 1727204458.48658: variable 'ansible_host' from source: host vars for 'managed-node1' 41684 1727204458.48662: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41684 1727204458.48675: variable 'omit' from source: magic vars 41684 1727204458.48920: variable 'ansible_distribution_major_version' from source: facts 41684 1727204458.48931: Evaluated conditional (ansible_distribution_major_version != '6'): True 41684 1727204458.49068: variable 'type' from source: set_fact 41684 1727204458.49072: variable 'state' from source: include params 41684 1727204458.49075: variable 'interface' from source: set_fact 41684 1727204458.49078: variable 'current_interfaces' from source: set_fact 41684 1727204458.49087: Evaluated conditional (type == 'dummy' and state == 'absent' and interface in current_interfaces): False 41684 1727204458.49089: when evaluation is False, skipping this task 41684 1727204458.49092: _execute() done 41684 1727204458.49095: dumping result to json 41684 1727204458.49098: done dumping result, returning 41684 1727204458.49100: done running TaskExecutor() for managed-node1/TASK: Delete dummy interface ethtest1 [0affcd87-79f5-3839-086d-000000000304] 41684 1727204458.49107: sending task result for task 0affcd87-79f5-3839-086d-000000000304 41684 1727204458.49188: done sending task result for task 0affcd87-79f5-3839-086d-000000000304 41684 1727204458.49191: WORKER PROCESS EXITING skipping: [managed-node1] => { "changed": false, "false_condition": "type == 'dummy' and state == 'absent' and interface in current_interfaces", "skip_reason": "Conditional result was 
False" } 41684 1727204458.49258: no more pending results, returning what we have 41684 1727204458.49261: results queue empty 41684 1727204458.49263: checking for any_errors_fatal 41684 1727204458.49269: done checking for any_errors_fatal 41684 1727204458.49270: checking for max_fail_percentage 41684 1727204458.49272: done checking for max_fail_percentage 41684 1727204458.49272: checking to see if all hosts have failed and the running result is not ok 41684 1727204458.49273: done checking to see if all hosts have failed 41684 1727204458.49274: getting the remaining hosts for this loop 41684 1727204458.49275: done getting the remaining hosts for this loop 41684 1727204458.49278: getting the next task for host managed-node1 41684 1727204458.49283: done getting next task for host managed-node1 41684 1727204458.49286: ^ task is: TASK: Create tap interface {{ interface }} 41684 1727204458.49288: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 41684 1727204458.49292: getting variables 41684 1727204458.49293: in VariableManager get_vars() 41684 1727204458.49326: Calling all_inventory to load vars for managed-node1 41684 1727204458.49328: Calling groups_inventory to load vars for managed-node1 41684 1727204458.49329: Calling all_plugins_inventory to load vars for managed-node1 41684 1727204458.49336: Calling all_plugins_play to load vars for managed-node1 41684 1727204458.49338: Calling groups_plugins_inventory to load vars for managed-node1 41684 1727204458.49339: Calling groups_plugins_play to load vars for managed-node1 41684 1727204458.49673: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41684 1727204458.49874: done with get_vars() 41684 1727204458.49883: done getting variables 41684 1727204458.49935: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 41684 1727204458.50036: variable 'interface' from source: set_fact TASK [Create tap interface ethtest1] ******************************************* task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:60 Tuesday 24 September 2024 15:00:58 -0400 (0:00:00.019) 0:00:14.902 ***** 41684 1727204458.50066: entering _queue_task() for managed-node1/command 41684 1727204458.50284: worker is 1 (out of 1 available) 41684 1727204458.50297: exiting _queue_task() for managed-node1/command 41684 1727204458.50309: done queuing things up, now waiting for results queue to drain 41684 1727204458.50310: waiting for pending results... 
41684 1727204458.50574: running TaskExecutor() for managed-node1/TASK: Create tap interface ethtest1 41684 1727204458.50683: in run() - task 0affcd87-79f5-3839-086d-000000000305 41684 1727204458.50704: variable 'ansible_search_path' from source: unknown 41684 1727204458.50713: variable 'ansible_search_path' from source: unknown 41684 1727204458.50757: calling self._execute() 41684 1727204458.50848: variable 'ansible_host' from source: host vars for 'managed-node1' 41684 1727204458.50863: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41684 1727204458.50880: variable 'omit' from source: magic vars 41684 1727204458.51202: variable 'ansible_distribution_major_version' from source: facts 41684 1727204458.51212: Evaluated conditional (ansible_distribution_major_version != '6'): True 41684 1727204458.51345: variable 'type' from source: set_fact 41684 1727204458.51349: variable 'state' from source: include params 41684 1727204458.51352: variable 'interface' from source: set_fact 41684 1727204458.51356: variable 'current_interfaces' from source: set_fact 41684 1727204458.51367: Evaluated conditional (type == 'tap' and state == 'present' and interface not in current_interfaces): False 41684 1727204458.51371: when evaluation is False, skipping this task 41684 1727204458.51374: _execute() done 41684 1727204458.51376: dumping result to json 41684 1727204458.51379: done dumping result, returning 41684 1727204458.51381: done running TaskExecutor() for managed-node1/TASK: Create tap interface ethtest1 [0affcd87-79f5-3839-086d-000000000305] 41684 1727204458.51388: sending task result for task 0affcd87-79f5-3839-086d-000000000305 41684 1727204458.51469: done sending task result for task 0affcd87-79f5-3839-086d-000000000305 41684 1727204458.51472: WORKER PROCESS EXITING skipping: [managed-node1] => { "changed": false, "false_condition": "type == 'tap' and state == 'present' and interface not in current_interfaces", "skip_reason": "Conditional result was 
False" } 41684 1727204458.51549: no more pending results, returning what we have 41684 1727204458.51552: results queue empty 41684 1727204458.51553: checking for any_errors_fatal 41684 1727204458.51558: done checking for any_errors_fatal 41684 1727204458.51559: checking for max_fail_percentage 41684 1727204458.51560: done checking for max_fail_percentage 41684 1727204458.51560: checking to see if all hosts have failed and the running result is not ok 41684 1727204458.51565: done checking to see if all hosts have failed 41684 1727204458.51566: getting the remaining hosts for this loop 41684 1727204458.51567: done getting the remaining hosts for this loop 41684 1727204458.51570: getting the next task for host managed-node1 41684 1727204458.51575: done getting next task for host managed-node1 41684 1727204458.51578: ^ task is: TASK: Delete tap interface {{ interface }} 41684 1727204458.51580: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 41684 1727204458.51583: getting variables 41684 1727204458.51584: in VariableManager get_vars() 41684 1727204458.51614: Calling all_inventory to load vars for managed-node1 41684 1727204458.51616: Calling groups_inventory to load vars for managed-node1 41684 1727204458.51617: Calling all_plugins_inventory to load vars for managed-node1 41684 1727204458.51624: Calling all_plugins_play to load vars for managed-node1 41684 1727204458.51626: Calling groups_plugins_inventory to load vars for managed-node1 41684 1727204458.51627: Calling groups_plugins_play to load vars for managed-node1 41684 1727204458.51746: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41684 1727204458.51877: done with get_vars() 41684 1727204458.51885: done getting variables 41684 1727204458.51923: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 41684 1727204458.52003: variable 'interface' from source: set_fact TASK [Delete tap interface ethtest1] ******************************************* task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:65 Tuesday 24 September 2024 15:00:58 -0400 (0:00:00.019) 0:00:14.921 ***** 41684 1727204458.52024: entering _queue_task() for managed-node1/command 41684 1727204458.52204: worker is 1 (out of 1 available) 41684 1727204458.52218: exiting _queue_task() for managed-node1/command 41684 1727204458.52232: done queuing things up, now waiting for results queue to drain 41684 1727204458.52233: waiting for pending results... 
41684 1727204458.52389: running TaskExecutor() for managed-node1/TASK: Delete tap interface ethtest1 41684 1727204458.52457: in run() - task 0affcd87-79f5-3839-086d-000000000306 41684 1727204458.52469: variable 'ansible_search_path' from source: unknown 41684 1727204458.52473: variable 'ansible_search_path' from source: unknown 41684 1727204458.52507: calling self._execute() 41684 1727204458.52568: variable 'ansible_host' from source: host vars for 'managed-node1' 41684 1727204458.52578: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41684 1727204458.52588: variable 'omit' from source: magic vars 41684 1727204458.52845: variable 'ansible_distribution_major_version' from source: facts 41684 1727204458.52855: Evaluated conditional (ansible_distribution_major_version != '6'): True 41684 1727204458.53006: variable 'type' from source: set_fact 41684 1727204458.53021: variable 'state' from source: include params 41684 1727204458.53035: variable 'interface' from source: set_fact 41684 1727204458.53045: variable 'current_interfaces' from source: set_fact 41684 1727204458.53057: Evaluated conditional (type == 'tap' and state == 'absent' and interface in current_interfaces): False 41684 1727204458.53066: when evaluation is False, skipping this task 41684 1727204458.53072: _execute() done 41684 1727204458.53078: dumping result to json 41684 1727204458.53083: done dumping result, returning 41684 1727204458.53091: done running TaskExecutor() for managed-node1/TASK: Delete tap interface ethtest1 [0affcd87-79f5-3839-086d-000000000306] 41684 1727204458.53101: sending task result for task 0affcd87-79f5-3839-086d-000000000306 41684 1727204458.53195: done sending task result for task 0affcd87-79f5-3839-086d-000000000306 41684 1727204458.53202: WORKER PROCESS EXITING skipping: [managed-node1] => { "changed": false, "false_condition": "type == 'tap' and state == 'absent' and interface in current_interfaces", "skip_reason": "Conditional result was False" } 
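In contrast to the skipped branches, the veth management task earlier in the log did execute, returning rc=0 for `nmcli d set ethtest1 managed true`. A hypothetical sketch of how that task might be expressed (the task name matches the log; the YAML body itself is an assumption, not the file's actual contents):

```yaml
# Speculative form of the one task in this stretch that ran.
- name: Set up veth as managed by NetworkManager
  command: nmcli d set {{ interface }} managed true
```

Because it is a `command` task, Ansible reports it as `changed: false` only when the module decides so; here the log shows the raw `cmd`, `rc`, `start`, `end`, and `delta` fields of the command result.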
41684 1727204458.53433: no more pending results, returning what we have 41684 1727204458.53437: results queue empty 41684 1727204458.53438: checking for any_errors_fatal 41684 1727204458.53442: done checking for any_errors_fatal 41684 1727204458.53443: checking for max_fail_percentage 41684 1727204458.53445: done checking for max_fail_percentage 41684 1727204458.53445: checking to see if all hosts have failed and the running result is not ok 41684 1727204458.53446: done checking to see if all hosts have failed 41684 1727204458.53447: getting the remaining hosts for this loop 41684 1727204458.53448: done getting the remaining hosts for this loop 41684 1727204458.53452: getting the next task for host managed-node1 41684 1727204458.53458: done getting next task for host managed-node1 41684 1727204458.53461: ^ task is: TASK: Assert device is present 41684 1727204458.53465: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 41684 1727204458.53469: getting variables 41684 1727204458.53470: in VariableManager get_vars() 41684 1727204458.53506: Calling all_inventory to load vars for managed-node1 41684 1727204458.53509: Calling groups_inventory to load vars for managed-node1 41684 1727204458.53511: Calling all_plugins_inventory to load vars for managed-node1 41684 1727204458.53520: Calling all_plugins_play to load vars for managed-node1 41684 1727204458.53523: Calling groups_plugins_inventory to load vars for managed-node1 41684 1727204458.53527: Calling groups_plugins_play to load vars for managed-node1 41684 1727204458.53770: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41684 1727204458.53979: done with get_vars() 41684 1727204458.53989: done getting variables TASK [Assert device is present] ************************************************ task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_route_device.yml:32 Tuesday 24 September 2024 15:00:58 -0400 (0:00:00.020) 0:00:14.942 ***** 41684 1727204458.54080: entering _queue_task() for managed-node1/include_tasks 41684 1727204458.54313: worker is 1 (out of 1 available) 41684 1727204458.54327: exiting _queue_task() for managed-node1/include_tasks 41684 1727204458.54339: done queuing things up, now waiting for results queue to drain 41684 1727204458.54340: waiting for pending results... 
41684 1727204458.54604: running TaskExecutor() for managed-node1/TASK: Assert device is present 41684 1727204458.54702: in run() - task 0affcd87-79f5-3839-086d-000000000012 41684 1727204458.54724: variable 'ansible_search_path' from source: unknown 41684 1727204458.54770: calling self._execute() 41684 1727204458.54859: variable 'ansible_host' from source: host vars for 'managed-node1' 41684 1727204458.54873: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41684 1727204458.54892: variable 'omit' from source: magic vars 41684 1727204458.55245: variable 'ansible_distribution_major_version' from source: facts 41684 1727204458.55274: Evaluated conditional (ansible_distribution_major_version != '6'): True 41684 1727204458.55278: _execute() done 41684 1727204458.55280: dumping result to json 41684 1727204458.55282: done dumping result, returning 41684 1727204458.55285: done running TaskExecutor() for managed-node1/TASK: Assert device is present [0affcd87-79f5-3839-086d-000000000012] 41684 1727204458.55288: sending task result for task 0affcd87-79f5-3839-086d-000000000012 41684 1727204458.55372: done sending task result for task 0affcd87-79f5-3839-086d-000000000012 41684 1727204458.55375: WORKER PROCESS EXITING 41684 1727204458.55401: no more pending results, returning what we have 41684 1727204458.55406: in VariableManager get_vars() 41684 1727204458.55447: Calling all_inventory to load vars for managed-node1 41684 1727204458.55450: Calling groups_inventory to load vars for managed-node1 41684 1727204458.55452: Calling all_plugins_inventory to load vars for managed-node1 41684 1727204458.55466: Calling all_plugins_play to load vars for managed-node1 41684 1727204458.55468: Calling groups_plugins_inventory to load vars for managed-node1 41684 1727204458.55471: Calling groups_plugins_play to load vars for managed-node1 41684 1727204458.55602: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved 
name 41684 1727204458.55736: done with get_vars() 41684 1727204458.55742: variable 'ansible_search_path' from source: unknown 41684 1727204458.55751: we have included files to process 41684 1727204458.55752: generating all_blocks data 41684 1727204458.55753: done generating all_blocks data 41684 1727204458.55758: processing included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml 41684 1727204458.55758: loading included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml 41684 1727204458.55760: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml 41684 1727204458.55834: in VariableManager get_vars() 41684 1727204458.55849: done with get_vars() 41684 1727204458.55925: done processing included file 41684 1727204458.55927: iterating over new_blocks loaded from include file 41684 1727204458.55928: in VariableManager get_vars() 41684 1727204458.55941: done with get_vars() 41684 1727204458.55942: filtering new block on tags 41684 1727204458.55954: done filtering new block on tags 41684 1727204458.55955: done iterating over new_blocks loaded from include file included: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml for managed-node1 41684 1727204458.55959: extending task lists for all hosts with included blocks 41684 1727204458.56670: done extending task lists 41684 1727204458.56671: done processing included files 41684 1727204458.56672: results queue empty 41684 1727204458.56672: checking for any_errors_fatal 41684 1727204458.56675: done checking for any_errors_fatal 41684 1727204458.56675: checking for max_fail_percentage 41684 1727204458.56676: done checking for max_fail_percentage 41684 1727204458.56676: checking to see if all hosts have failed and the 
running result is not ok 41684 1727204458.56677: done checking to see if all hosts have failed 41684 1727204458.56677: getting the remaining hosts for this loop 41684 1727204458.56678: done getting the remaining hosts for this loop 41684 1727204458.56680: getting the next task for host managed-node1 41684 1727204458.56682: done getting next task for host managed-node1 41684 1727204458.56684: ^ task is: TASK: Include the task 'get_interface_stat.yml' 41684 1727204458.56685: ^ state is: HOST STATE: block=2, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False
41684 1727204458.56687: getting variables
41684 1727204458.56688: in VariableManager get_vars()
41684 1727204458.56699: Calling all_inventory to load vars for managed-node1
41684 1727204458.56700: Calling groups_inventory to load vars for managed-node1
41684 1727204458.56701: Calling all_plugins_inventory to load vars for managed-node1
41684 1727204458.56705: Calling all_plugins_play to load vars for managed-node1
41684 1727204458.56707: Calling groups_plugins_inventory to load vars for managed-node1
41684 1727204458.56708: Calling groups_plugins_play to load vars for managed-node1
41684 1727204458.56798: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
41684 1727204458.57012: done with get_vars()
41684 1727204458.57023: done getting variables

TASK [Include the task 'get_interface_stat.yml'] *******************************
task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml:3
Tuesday 24 September 2024  15:00:58 -0400 (0:00:00.030)       0:00:14.972 *****
41684 1727204458.57095: entering _queue_task() for managed-node1/include_tasks
41684 1727204458.57350: worker is 1 (out of 1 available)
41684 1727204458.57368: exiting _queue_task() for managed-node1/include_tasks
41684 1727204458.57380: done queuing things up, now waiting for results queue to drain
41684 1727204458.57381: waiting for pending results...
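For orientation: the include task that this banner refers to (line 3 of assert_device_present.yml) plausibly looks like the sketch below. This is reconstructed from the banner text and task path in the log, not from the actual collection file, which may differ.

```yaml
# Hypothetical sketch of assert_device_present.yml:3, inferred from the
# TASK banner above; the real file ships in the fedora.linux_system_roles
# collection and may carry additional parameters.
- name: Include the task 'get_interface_stat.yml'
  ansible.builtin.include_tasks: get_interface_stat.yml
```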
41684 1727204458.57666: running TaskExecutor() for managed-node1/TASK: Include the task 'get_interface_stat.yml' 41684 1727204458.57778: in run() - task 0affcd87-79f5-3839-086d-0000000003eb 41684 1727204458.57800: variable 'ansible_search_path' from source: unknown 41684 1727204458.57812: variable 'ansible_search_path' from source: unknown 41684 1727204458.57861: calling self._execute() 41684 1727204458.57985: variable 'ansible_host' from source: host vars for 'managed-node1' 41684 1727204458.57994: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41684 1727204458.58117: variable 'omit' from source: magic vars 41684 1727204458.58389: variable 'ansible_distribution_major_version' from source: facts 41684 1727204458.58406: Evaluated conditional (ansible_distribution_major_version != '6'): True 41684 1727204458.58418: _execute() done 41684 1727204458.58426: dumping result to json 41684 1727204458.58433: done dumping result, returning 41684 1727204458.58449: done running TaskExecutor() for managed-node1/TASK: Include the task 'get_interface_stat.yml' [0affcd87-79f5-3839-086d-0000000003eb] 41684 1727204458.58470: sending task result for task 0affcd87-79f5-3839-086d-0000000003eb 41684 1727204458.58595: no more pending results, returning what we have 41684 1727204458.58600: in VariableManager get_vars() 41684 1727204458.58646: Calling all_inventory to load vars for managed-node1 41684 1727204458.58649: Calling groups_inventory to load vars for managed-node1 41684 1727204458.58651: Calling all_plugins_inventory to load vars for managed-node1 41684 1727204458.58669: Calling all_plugins_play to load vars for managed-node1 41684 1727204458.58672: Calling groups_plugins_inventory to load vars for managed-node1 41684 1727204458.58676: Calling groups_plugins_play to load vars for managed-node1 41684 1727204458.58913: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41684 1727204458.59134: done 
with get_vars() 41684 1727204458.59141: variable 'ansible_search_path' from source: unknown 41684 1727204458.59142: variable 'ansible_search_path' from source: unknown 41684 1727204458.59192: we have included files to process 41684 1727204458.59193: generating all_blocks data 41684 1727204458.59195: done generating all_blocks data 41684 1727204458.59196: processing included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 41684 1727204458.59197: loading included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 41684 1727204458.59200: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 41684 1727204458.59378: done sending task result for task 0affcd87-79f5-3839-086d-0000000003eb 41684 1727204458.59402: WORKER PROCESS EXITING 41684 1727204458.59517: done processing included file 41684 1727204458.59518: iterating over new_blocks loaded from include file 41684 1727204458.59524: in VariableManager get_vars() 41684 1727204458.59542: done with get_vars() 41684 1727204458.59543: filtering new block on tags 41684 1727204458.59556: done filtering new block on tags 41684 1727204458.59557: done iterating over new_blocks loaded from include file included: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml for managed-node1 41684 1727204458.59562: extending task lists for all hosts with included blocks 41684 1727204458.59624: done extending task lists 41684 1727204458.59625: done processing included files 41684 1727204458.59626: results queue empty 41684 1727204458.59626: checking for any_errors_fatal 41684 1727204458.59630: done checking for any_errors_fatal 41684 1727204458.59631: checking for max_fail_percentage 41684 1727204458.59632: done checking for max_fail_percentage 
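The file being loaded here, get_interface_stat.yml, plausibly contains a single stat task along these lines. The sketch is inferred from the module_args visible further down in this log (path, get_attributes, get_checksum, get_mime); the real task also registers its result for the later assert, under a variable name not shown in this excerpt.

```yaml
# Hypothetical sketch of get_interface_stat.yml, reconstructed from the
# stat module_args logged below; not the actual collection file.
- name: Get stat for interface {{ interface }}
  ansible.builtin.stat:
    path: /sys/class/net/{{ interface }}
    get_attributes: false
    get_checksum: false
    get_mime: false
```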
41684 1727204458.59632: checking to see if all hosts have failed and the running result is not ok 41684 1727204458.59633: done checking to see if all hosts have failed 41684 1727204458.59633: getting the remaining hosts for this loop 41684 1727204458.59634: done getting the remaining hosts for this loop 41684 1727204458.59636: getting the next task for host managed-node1 41684 1727204458.59638: done getting next task for host managed-node1 41684 1727204458.59641: ^ task is: TASK: Get stat for interface {{ interface }} 41684 1727204458.59643: ^ state is: HOST STATE: block=2, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False
41684 1727204458.59644: getting variables
41684 1727204458.59645: in VariableManager get_vars()
41684 1727204458.59654: Calling all_inventory to load vars for managed-node1
41684 1727204458.59655: Calling groups_inventory to load vars for managed-node1
41684 1727204458.59656: Calling all_plugins_inventory to load vars for managed-node1
41684 1727204458.59660: Calling all_plugins_play to load vars for managed-node1
41684 1727204458.59661: Calling groups_plugins_inventory to load vars for managed-node1
41684 1727204458.59665: Calling groups_plugins_play to load vars for managed-node1
41684 1727204458.59777: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
41684 1727204458.59896: done with get_vars()
41684 1727204458.59903: done getting variables
41684 1727204458.60013: variable 'interface' from source: set_fact

TASK [Get stat for interface ethtest1] *****************************************
task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml:3
Tuesday 24 September 2024  15:00:58 -0400 (0:00:00.029)       0:00:15.001 *****
41684 1727204458.60034: entering _queue_task() for managed-node1/stat
41684 1727204458.60228: worker is 1 (out of 1 available)
41684 1727204458.60242: exiting _queue_task() for managed-node1/stat
41684 1727204458.60255: done queuing things up, now waiting for results queue to drain
41684 1727204458.60257: waiting for pending results...
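The AnsiballZ payload queued here ultimately runs the stat module against /sys/class/net/ethtest1 on the target. The handful of facts this test actually cares about (exists, islnk, lnk_target, mode) can be approximated locally with os.lstat and os.readlink. A minimal sketch, assuming a hypothetical helper name iface_stat; this is not Ansible's implementation:

```python
import os
import stat as statmod

def iface_stat(path):
    """Approximate the subset of ansible.builtin.stat facts seen in this log:
    whether the path exists, whether it is a symlink (as entries under
    /sys/class/net are), its permission bits, and the link target."""
    try:
        st = os.lstat(path)  # lstat so the symlink itself is examined
    except FileNotFoundError:
        return {"exists": False}
    islnk = statmod.S_ISLNK(st.st_mode)
    facts = {
        "exists": True,
        "islnk": islnk,
        "mode": format(statmod.S_IMODE(st.st_mode), "04o"),
    }
    if islnk:
        facts["lnk_target"] = os.readlink(path)
    return facts
```

On the managed node in this log, such a helper run against /sys/class/net/ethtest1 would report the link as existing with target ../../devices/virtual/net/ethtest1, consistent with the module result recorded later in the transcript.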
41684 1727204458.60417: running TaskExecutor() for managed-node1/TASK: Get stat for interface ethtest1 41684 1727204458.60494: in run() - task 0affcd87-79f5-3839-086d-000000000483 41684 1727204458.60505: variable 'ansible_search_path' from source: unknown 41684 1727204458.60509: variable 'ansible_search_path' from source: unknown 41684 1727204458.60536: calling self._execute() 41684 1727204458.60605: variable 'ansible_host' from source: host vars for 'managed-node1' 41684 1727204458.60609: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41684 1727204458.60618: variable 'omit' from source: magic vars 41684 1727204458.60877: variable 'ansible_distribution_major_version' from source: facts 41684 1727204458.60888: Evaluated conditional (ansible_distribution_major_version != '6'): True 41684 1727204458.60894: variable 'omit' from source: magic vars 41684 1727204458.60927: variable 'omit' from source: magic vars 41684 1727204458.60998: variable 'interface' from source: set_fact 41684 1727204458.61010: variable 'omit' from source: magic vars 41684 1727204458.61045: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 41684 1727204458.61077: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 41684 1727204458.61107: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 41684 1727204458.61146: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41684 1727204458.61167: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41684 1727204458.61200: variable 'inventory_hostname' from source: host vars for 'managed-node1' 41684 1727204458.61208: variable 'ansible_host' from source: host vars for 'managed-node1' 41684 1727204458.61214: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41684 1727204458.61327: Set connection var ansible_connection to ssh 41684 1727204458.61338: Set connection var ansible_pipelining to False 41684 1727204458.61349: Set connection var ansible_module_compression to ZIP_DEFLATED 41684 1727204458.61370: Set connection var ansible_timeout to 10 41684 1727204458.61388: Set connection var ansible_shell_executable to /bin/sh 41684 1727204458.61396: Set connection var ansible_shell_type to sh 41684 1727204458.61424: variable 'ansible_shell_executable' from source: unknown 41684 1727204458.61432: variable 'ansible_connection' from source: unknown 41684 1727204458.61440: variable 'ansible_module_compression' from source: unknown 41684 1727204458.61447: variable 'ansible_shell_type' from source: unknown 41684 1727204458.61454: variable 'ansible_shell_executable' from source: unknown 41684 1727204458.61463: variable 'ansible_host' from source: host vars for 'managed-node1' 41684 1727204458.61480: variable 'ansible_pipelining' from source: unknown 41684 1727204458.61493: variable 'ansible_timeout' from source: unknown 41684 1727204458.61501: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41684 1727204458.61720: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 41684 1727204458.61734: variable 'omit' from source: magic vars 41684 1727204458.61743: starting attempt loop 41684 1727204458.61750: running the handler 41684 1727204458.61770: _low_level_execute_command(): starting 41684 1727204458.61783: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 41684 1727204458.62519: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: 
Reading configuration data /etc/ssh/ssh_config <<< 41684 1727204458.62524: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41684 1727204458.62554: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204458.62579: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 41684 1727204458.62583: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204458.62629: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 41684 1727204458.62633: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41684 1727204458.62638: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41684 1727204458.62693: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41684 1727204458.64233: stdout chunk (state=3): >>>/root <<< 41684 1727204458.64343: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41684 1727204458.64398: stderr chunk (state=3): >>><<< 41684 1727204458.64402: stdout chunk (state=3): >>><<< 41684 1727204458.64425: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41684 1727204458.64436: _low_level_execute_command(): starting 41684 1727204458.64443: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204458.6442451-43207-261157086999163 `" && echo ansible-tmp-1727204458.6442451-43207-261157086999163="` echo /root/.ansible/tmp/ansible-tmp-1727204458.6442451-43207-261157086999163 `" ) && sleep 0' 41684 1727204458.64896: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 41684 1727204458.64909: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41684 1727204458.64927: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found <<< 41684 1727204458.64959: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204458.64998: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 41684 1727204458.65010: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 41684 1727204458.65084: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41684 1727204458.66932: stdout chunk (state=3): >>>ansible-tmp-1727204458.6442451-43207-261157086999163=/root/.ansible/tmp/ansible-tmp-1727204458.6442451-43207-261157086999163 <<< 41684 1727204458.67038: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41684 1727204458.67104: stderr chunk (state=3): >>><<< 41684 1727204458.67110: stdout chunk (state=3): >>><<< 41684 1727204458.67127: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204458.6442451-43207-261157086999163=/root/.ansible/tmp/ansible-tmp-1727204458.6442451-43207-261157086999163 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is 
address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41684 1727204458.67170: variable 'ansible_module_compression' from source: unknown 41684 1727204458.67224: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-41684fyviudxd/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 41684 1727204458.67256: variable 'ansible_facts' from source: unknown 41684 1727204458.67324: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204458.6442451-43207-261157086999163/AnsiballZ_stat.py 41684 1727204458.67435: Sending initial data 41684 1727204458.67443: Sent initial data (153 bytes) 41684 1727204458.68146: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 41684 1727204458.68149: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41684 1727204458.68195: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204458.68199: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: 
Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found <<< 41684 1727204458.68201: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204458.68251: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 41684 1727204458.68254: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41684 1727204458.68260: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41684 1727204458.68318: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41684 1727204458.70033: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 41684 1727204458.70083: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 41684 1727204458.70138: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-41684fyviudxd/tmp035vkym0 /root/.ansible/tmp/ansible-tmp-1727204458.6442451-43207-261157086999163/AnsiballZ_stat.py <<< 41684 1727204458.70190: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 41684 1727204458.71031: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41684 
1727204458.71149: stderr chunk (state=3): >>><<< 41684 1727204458.71153: stdout chunk (state=3): >>><<< 41684 1727204458.71175: done transferring module to remote 41684 1727204458.71184: _low_level_execute_command(): starting 41684 1727204458.71189: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204458.6442451-43207-261157086999163/ /root/.ansible/tmp/ansible-tmp-1727204458.6442451-43207-261157086999163/AnsiballZ_stat.py && sleep 0' 41684 1727204458.71658: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 41684 1727204458.71679: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41684 1727204458.71691: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204458.71703: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204458.71755: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 41684 1727204458.71778: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 41684 1727204458.71824: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41684 1727204458.73543: stderr chunk 
(state=3): >>>debug2: Received exit status from master 0 <<< 41684 1727204458.73603: stderr chunk (state=3): >>><<< 41684 1727204458.73606: stdout chunk (state=3): >>><<< 41684 1727204458.73622: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41684 1727204458.73625: _low_level_execute_command(): starting 41684 1727204458.73630: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204458.6442451-43207-261157086999163/AnsiballZ_stat.py && sleep 0' 41684 1727204458.74087: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 41684 1727204458.74099: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41684 1727204458.74126: 
stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 41684 1727204458.74139: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204458.74184: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 41684 1727204458.74196: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 41684 1727204458.74268: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41684 1727204458.87285: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": true, "path": "/sys/class/net/ethtest1", "mode": "0777", "isdir": false, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": true, "issock": false, "uid": 0, "gid": 0, "size": 0, "inode": 29203, "dev": 21, "nlink": 1, "atime": 1727204457.3308582, "mtime": 1727204457.3308582, "ctime": 1727204457.3308582, "wusr": true, "rusr": true, "xusr": true, "wgrp": true, "rgrp": true, "xgrp": true, "woth": true, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "lnk_source": "/sys/devices/virtual/net/ethtest1", "lnk_target": "../../devices/virtual/net/ethtest1", "pw_name": "root", "gr_name": "root"}, "invocation": {"module_args": {"get_attributes": 
false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/ethtest1", "follow": false, "checksum_algorithm": "sha1"}}} <<< 41684 1727204458.88298: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.148 closed. <<< 41684 1727204458.88303: stdout chunk (state=3): >>><<< 41684 1727204458.88305: stderr chunk (state=3): >>><<< 41684 1727204458.88476: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": true, "path": "/sys/class/net/ethtest1", "mode": "0777", "isdir": false, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": true, "issock": false, "uid": 0, "gid": 0, "size": 0, "inode": 29203, "dev": 21, "nlink": 1, "atime": 1727204457.3308582, "mtime": 1727204457.3308582, "ctime": 1727204457.3308582, "wusr": true, "rusr": true, "xusr": true, "wgrp": true, "rgrp": true, "xgrp": true, "woth": true, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "lnk_source": "/sys/devices/virtual/net/ethtest1", "lnk_target": "../../devices/virtual/net/ethtest1", "pw_name": "root", "gr_name": "root"}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/ethtest1", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration 
data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.148 closed. 41684 1727204458.88480: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/sys/class/net/ethtest1', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204458.6442451-43207-261157086999163/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 41684 1727204458.88483: _low_level_execute_command(): starting 41684 1727204458.88486: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204458.6442451-43207-261157086999163/ > /dev/null 2>&1 && sleep 0' 41684 1727204458.89397: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 41684 1727204458.89401: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf <<< 41684 1727204458.89435: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found <<< 41684 1727204458.89442: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 41684 1727204458.89444: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204458.89515: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 41684 1727204458.89519: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41684 1727204458.89521: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41684 1727204458.89597: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41684 1727204458.91455: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41684 1727204458.91460: stdout chunk (state=3): >>><<< 41684 1727204458.91480: stderr chunk (state=3): >>><<< 41684 1727204458.91680: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41684 1727204458.91684: handler run complete 41684 1727204458.91687: attempt loop complete, returning result 41684 1727204458.91689: _execute() done 41684 1727204458.91691: dumping result to json 41684 1727204458.91693: done dumping result, returning 41684 1727204458.91695: done running TaskExecutor() for managed-node1/TASK: Get stat for interface ethtest1 [0affcd87-79f5-3839-086d-000000000483] 41684 1727204458.91697: sending task result for task 0affcd87-79f5-3839-086d-000000000483 41684 1727204458.91794: done sending task result for task 0affcd87-79f5-3839-086d-000000000483 41684 1727204458.91797: WORKER PROCESS EXITING ok: [managed-node1] => { "changed": false, "stat": { "atime": 1727204457.3308582, "block_size": 4096, "blocks": 0, "ctime": 1727204457.3308582, "dev": 21, "device_type": 0, "executable": true, "exists": true, "gid": 0, "gr_name": "root", "inode": 29203, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": true, "isreg": false, "issock": false, "isuid": false, "lnk_source": "/sys/devices/virtual/net/ethtest1", "lnk_target": "../../devices/virtual/net/ethtest1", "mode": "0777", "mtime": 1727204457.3308582, "nlink": 1, "path": "/sys/class/net/ethtest1", "pw_name": "root", "readable": true, "rgrp": true, 
"roth": true, "rusr": true, "size": 0, "uid": 0, "wgrp": true, "woth": true, "writeable": true, "wusr": true, "xgrp": true, "xoth": true, "xusr": true } } 41684 1727204458.91903: no more pending results, returning what we have 41684 1727204458.91908: results queue empty 41684 1727204458.91909: checking for any_errors_fatal 41684 1727204458.91911: done checking for any_errors_fatal 41684 1727204458.91912: checking for max_fail_percentage 41684 1727204458.91913: done checking for max_fail_percentage 41684 1727204458.91915: checking to see if all hosts have failed and the running result is not ok 41684 1727204458.91916: done checking to see if all hosts have failed 41684 1727204458.91916: getting the remaining hosts for this loop 41684 1727204458.91918: done getting the remaining hosts for this loop 41684 1727204458.91923: getting the next task for host managed-node1 41684 1727204458.91932: done getting next task for host managed-node1 41684 1727204458.91935: ^ task is: TASK: Assert that the interface is present - '{{ interface }}' 41684 1727204458.91939: ^ state is: HOST STATE: block=2, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False
41684 1727204458.91944: getting variables
41684 1727204458.91946: in VariableManager get_vars()
41684 1727204458.92000: Calling all_inventory to load vars for managed-node1
41684 1727204458.92003: Calling groups_inventory to load vars for managed-node1
41684 1727204458.92005: Calling all_plugins_inventory to load vars for managed-node1
41684 1727204458.92018: Calling all_plugins_play to load vars for managed-node1
41684 1727204458.92021: Calling groups_plugins_inventory to load vars for managed-node1
41684 1727204458.92024: Calling groups_plugins_play to load vars for managed-node1
41684 1727204458.92545: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
41684 1727204458.92838: done with get_vars()
41684 1727204458.92858: done getting variables
41684 1727204458.92924: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)
41684 1727204458.93057: variable 'interface' from source: set_fact

TASK [Assert that the interface is present - 'ethtest1'] ***********************
task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml:5
Tuesday 24 September 2024 15:00:58 -0400 (0:00:00.330) 0:00:15.332 *****

41684 1727204458.93097: entering _queue_task() for managed-node1/assert
41684 1727204458.93406: worker is 1 (out of 1 available)
41684 1727204458.93420: exiting _queue_task() for managed-node1/assert
41684 1727204458.93434: done queuing things up, now waiting for results queue to drain
41684 1727204458.93435: waiting for pending results... 
41684 1727204458.93717: running TaskExecutor() for managed-node1/TASK: Assert that the interface is present - 'ethtest1' 41684 1727204458.93823: in run() - task 0affcd87-79f5-3839-086d-0000000003ec 41684 1727204458.93850: variable 'ansible_search_path' from source: unknown 41684 1727204458.93858: variable 'ansible_search_path' from source: unknown 41684 1727204458.93909: calling self._execute() 41684 1727204458.94011: variable 'ansible_host' from source: host vars for 'managed-node1' 41684 1727204458.94022: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41684 1727204458.94037: variable 'omit' from source: magic vars 41684 1727204458.94520: variable 'ansible_distribution_major_version' from source: facts 41684 1727204458.94543: Evaluated conditional (ansible_distribution_major_version != '6'): True 41684 1727204458.94555: variable 'omit' from source: magic vars 41684 1727204458.94603: variable 'omit' from source: magic vars 41684 1727204458.94721: variable 'interface' from source: set_fact 41684 1727204458.94745: variable 'omit' from source: magic vars 41684 1727204458.94796: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 41684 1727204458.94842: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 41684 1727204458.94878: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 41684 1727204458.94900: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41684 1727204458.94915: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41684 1727204458.94953: variable 'inventory_hostname' from source: host vars for 'managed-node1' 41684 1727204458.94967: variable 'ansible_host' from source: host vars for 'managed-node1' 41684 1727204458.94978: 
variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41684 1727204458.95091: Set connection var ansible_connection to ssh 41684 1727204458.95103: Set connection var ansible_pipelining to False 41684 1727204458.95112: Set connection var ansible_module_compression to ZIP_DEFLATED 41684 1727204458.95122: Set connection var ansible_timeout to 10 41684 1727204458.95134: Set connection var ansible_shell_executable to /bin/sh 41684 1727204458.95140: Set connection var ansible_shell_type to sh 41684 1727204458.95178: variable 'ansible_shell_executable' from source: unknown 41684 1727204458.95191: variable 'ansible_connection' from source: unknown 41684 1727204458.95198: variable 'ansible_module_compression' from source: unknown 41684 1727204458.95205: variable 'ansible_shell_type' from source: unknown 41684 1727204458.95212: variable 'ansible_shell_executable' from source: unknown 41684 1727204458.95218: variable 'ansible_host' from source: host vars for 'managed-node1' 41684 1727204458.95225: variable 'ansible_pipelining' from source: unknown 41684 1727204458.95232: variable 'ansible_timeout' from source: unknown 41684 1727204458.95239: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41684 1727204458.95396: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 41684 1727204458.95416: variable 'omit' from source: magic vars 41684 1727204458.95427: starting attempt loop 41684 1727204458.95433: running the handler 41684 1727204458.95597: variable 'interface_stat' from source: set_fact 41684 1727204458.95625: Evaluated conditional (interface_stat.stat.exists): True 41684 1727204458.95635: handler run complete 41684 1727204458.95654: attempt loop complete, returning result 41684 
1727204458.95665: _execute() done
41684 1727204458.95674: dumping result to json
41684 1727204458.95682: done dumping result, returning
41684 1727204458.95694: done running TaskExecutor() for managed-node1/TASK: Assert that the interface is present - 'ethtest1' [0affcd87-79f5-3839-086d-0000000003ec]
41684 1727204458.95708: sending task result for task 0affcd87-79f5-3839-086d-0000000003ec
41684 1727204458.95821: done sending task result for task 0affcd87-79f5-3839-086d-0000000003ec
41684 1727204458.95828: WORKER PROCESS EXITING
ok: [managed-node1] => {
    "changed": false
}

MSG:

All assertions passed

41684 1727204458.95888: no more pending results, returning what we have
41684 1727204458.95892: results queue empty
41684 1727204458.95893: checking for any_errors_fatal
41684 1727204458.95902: done checking for any_errors_fatal
41684 1727204458.95903: checking for max_fail_percentage
41684 1727204458.95905: done checking for max_fail_percentage
41684 1727204458.95905: checking to see if all hosts have failed and the running result is not ok
41684 1727204458.95906: done checking to see if all hosts have failed
41684 1727204458.95907: getting the remaining hosts for this loop
41684 1727204458.95909: done getting the remaining hosts for this loop
41684 1727204458.95913: getting the next task for host managed-node1
41684 1727204458.95925: done getting next task for host managed-node1
41684 1727204458.95931: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role
41684 1727204458.95934: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 41684 1727204458.95951: getting variables 41684 1727204458.95953: in VariableManager get_vars() 41684 1727204458.96002: Calling all_inventory to load vars for managed-node1 41684 1727204458.96006: Calling groups_inventory to load vars for managed-node1 41684 1727204458.96008: Calling all_plugins_inventory to load vars for managed-node1 41684 1727204458.96020: Calling all_plugins_play to load vars for managed-node1 41684 1727204458.96023: Calling groups_plugins_inventory to load vars for managed-node1 41684 1727204458.96026: Calling groups_plugins_play to load vars for managed-node1 41684 1727204458.96296: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41684 1727204458.96583: done with get_vars() 41684 1727204458.96596: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4 Tuesday 24 September 2024 15:00:58 -0400 (0:00:00.037) 0:00:15.369 ***** 41684 1727204458.96816: entering _queue_task() for managed-node1/include_tasks 41684 1727204458.97232: worker is 1 (out of 1 available) 41684 1727204458.97248: exiting _queue_task() for managed-node1/include_tasks 41684 1727204458.97270: done queuing things up, now waiting for results queue to drain 41684 1727204458.97273: waiting for pending results... 
41684 1727204458.97579: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 41684 1727204458.97740: in run() - task 0affcd87-79f5-3839-086d-00000000001b 41684 1727204458.97766: variable 'ansible_search_path' from source: unknown 41684 1727204458.97776: variable 'ansible_search_path' from source: unknown 41684 1727204458.97828: calling self._execute() 41684 1727204458.97922: variable 'ansible_host' from source: host vars for 'managed-node1' 41684 1727204458.97942: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41684 1727204458.97959: variable 'omit' from source: magic vars 41684 1727204458.98341: variable 'ansible_distribution_major_version' from source: facts 41684 1727204458.98361: Evaluated conditional (ansible_distribution_major_version != '6'): True 41684 1727204458.98382: _execute() done 41684 1727204458.98390: dumping result to json 41684 1727204458.98397: done dumping result, returning 41684 1727204458.98407: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role [0affcd87-79f5-3839-086d-00000000001b] 41684 1727204458.98418: sending task result for task 0affcd87-79f5-3839-086d-00000000001b 41684 1727204458.98571: no more pending results, returning what we have 41684 1727204458.98577: in VariableManager get_vars() 41684 1727204458.98632: Calling all_inventory to load vars for managed-node1 41684 1727204458.98635: Calling groups_inventory to load vars for managed-node1 41684 1727204458.98638: Calling all_plugins_inventory to load vars for managed-node1 41684 1727204458.98651: Calling all_plugins_play to load vars for managed-node1 41684 1727204458.98654: Calling groups_plugins_inventory to load vars for managed-node1 41684 1727204458.98657: Calling groups_plugins_play to load vars for managed-node1 41684 1727204458.98885: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' 
skipped due to reserved name 41684 1727204458.99126: done with get_vars() 41684 1727204458.99136: variable 'ansible_search_path' from source: unknown 41684 1727204458.99138: variable 'ansible_search_path' from source: unknown 41684 1727204458.99196: done sending task result for task 0affcd87-79f5-3839-086d-00000000001b 41684 1727204458.99203: WORKER PROCESS EXITING 41684 1727204458.99301: we have included files to process 41684 1727204458.99302: generating all_blocks data 41684 1727204458.99304: done generating all_blocks data 41684 1727204458.99312: processing included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 41684 1727204458.99313: loading included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 41684 1727204458.99316: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 41684 1727204459.00283: done processing included file 41684 1727204459.00285: iterating over new_blocks loaded from include file 41684 1727204459.00287: in VariableManager get_vars() 41684 1727204459.00313: done with get_vars() 41684 1727204459.00315: filtering new block on tags 41684 1727204459.00332: done filtering new block on tags 41684 1727204459.00335: in VariableManager get_vars() 41684 1727204459.00357: done with get_vars() 41684 1727204459.00358: filtering new block on tags 41684 1727204459.00387: done filtering new block on tags 41684 1727204459.00391: in VariableManager get_vars() 41684 1727204459.00415: done with get_vars() 41684 1727204459.00417: filtering new block on tags 41684 1727204459.00435: done filtering new block on tags 41684 1727204459.00437: done iterating over new_blocks loaded from include file included: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml for managed-node1 41684 1727204459.00443: extending task lists for all hosts 
with included blocks 41684 1727204459.01327: done extending task lists 41684 1727204459.01329: done processing included files 41684 1727204459.01330: results queue empty 41684 1727204459.01331: checking for any_errors_fatal 41684 1727204459.01334: done checking for any_errors_fatal 41684 1727204459.01335: checking for max_fail_percentage 41684 1727204459.01336: done checking for max_fail_percentage 41684 1727204459.01336: checking to see if all hosts have failed and the running result is not ok 41684 1727204459.01337: done checking to see if all hosts have failed 41684 1727204459.01338: getting the remaining hosts for this loop 41684 1727204459.01339: done getting the remaining hosts for this loop 41684 1727204459.01342: getting the next task for host managed-node1 41684 1727204459.01346: done getting next task for host managed-node1 41684 1727204459.01349: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 41684 1727204459.01352: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False
41684 1727204459.01368: getting variables
41684 1727204459.01370: in VariableManager get_vars()
41684 1727204459.01388: Calling all_inventory to load vars for managed-node1
41684 1727204459.01391: Calling groups_inventory to load vars for managed-node1
41684 1727204459.01393: Calling all_plugins_inventory to load vars for managed-node1
41684 1727204459.01398: Calling all_plugins_play to load vars for managed-node1
41684 1727204459.01401: Calling groups_plugins_inventory to load vars for managed-node1
41684 1727204459.01404: Calling groups_plugins_play to load vars for managed-node1
41684 1727204459.01603: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
41684 1727204459.01835: done with get_vars()
41684 1727204459.01845: done getting variables

TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role are present] ***
task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:3
Tuesday 24 September 2024 15:00:59 -0400 (0:00:00.051) 0:00:15.420 *****

41684 1727204459.01930: entering _queue_task() for managed-node1/setup
41684 1727204459.02258: worker is 1 (out of 1 available)
41684 1727204459.02277: exiting _queue_task() for managed-node1/setup
41684 1727204459.02290: done queuing things up, now waiting for results queue to drain
41684 1727204459.02292: waiting for pending results... 
41684 1727204459.02594: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 41684 1727204459.02754: in run() - task 0affcd87-79f5-3839-086d-00000000049b 41684 1727204459.02780: variable 'ansible_search_path' from source: unknown 41684 1727204459.02792: variable 'ansible_search_path' from source: unknown 41684 1727204459.02835: calling self._execute() 41684 1727204459.02933: variable 'ansible_host' from source: host vars for 'managed-node1' 41684 1727204459.02945: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41684 1727204459.02971: variable 'omit' from source: magic vars 41684 1727204459.03357: variable 'ansible_distribution_major_version' from source: facts 41684 1727204459.03380: Evaluated conditional (ansible_distribution_major_version != '6'): True 41684 1727204459.03624: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 41684 1727204459.05345: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 41684 1727204459.05394: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 41684 1727204459.05422: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 41684 1727204459.05448: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 41684 1727204459.05471: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 41684 1727204459.05530: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 41684 1727204459.05550: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' 
from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 41684 1727204459.05573: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 41684 1727204459.05600: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 41684 1727204459.05611: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 41684 1727204459.05660: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 41684 1727204459.05692: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 41684 1727204459.05709: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 41684 1727204459.05752: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 41684 1727204459.05779: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 41684 1727204459.05931: variable '__network_required_facts' from source: role 
'' defaults
41684 1727204459.05946: variable 'ansible_facts' from source: unknown
41684 1727204459.06051: Evaluated conditional (__network_required_facts | difference(ansible_facts.keys() | list) | length > 0): False
41684 1727204459.06059: when evaluation is False, skipping this task
41684 1727204459.06073: _execute() done
41684 1727204459.06080: dumping result to json
41684 1727204459.06087: done dumping result, returning
41684 1727204459.06098: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present [0affcd87-79f5-3839-086d-00000000049b]
41684 1727204459.06108: sending task result for task 0affcd87-79f5-3839-086d-00000000049b
skipping: [managed-node1] => {
    "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result",
    "changed": false
}
41684 1727204459.06257: no more pending results, returning what we have
41684 1727204459.06262: results queue empty
41684 1727204459.06263: checking for any_errors_fatal
41684 1727204459.06266: done checking for any_errors_fatal
41684 1727204459.06267: checking for max_fail_percentage
41684 1727204459.06269: done checking for max_fail_percentage
41684 1727204459.06269: checking to see if all hosts have failed and the running result is not ok
41684 1727204459.06270: done checking to see if all hosts have failed
41684 1727204459.06271: getting the remaining hosts for this loop
41684 1727204459.06272: done getting the remaining hosts for this loop
41684 1727204459.06276: getting the next task for host managed-node1
41684 1727204459.06285: done getting next task for host managed-node1
41684 1727204459.06289: ^ task is: TASK: fedora.linux_system_roles.network : Check if system is ostree
41684 1727204459.06292: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
41684 1727204459.06310: getting variables
41684 1727204459.06311: in VariableManager get_vars()
41684 1727204459.06353: Calling all_inventory to load vars for managed-node1
41684 1727204459.06356: Calling groups_inventory to load vars for managed-node1
41684 1727204459.06358: Calling all_plugins_inventory to load vars for managed-node1
41684 1727204459.06373: Calling all_plugins_play to load vars for managed-node1
41684 1727204459.06375: Calling groups_plugins_inventory to load vars for managed-node1
41684 1727204459.06379: Calling groups_plugins_play to load vars for managed-node1
41684 1727204459.06559: done sending task result for task 0affcd87-79f5-3839-086d-00000000049b
41684 1727204459.06567: WORKER PROCESS EXITING
41684 1727204459.06582: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
41684 1727204459.06845: done with get_vars()
41684 1727204459.06856: done getting variables

TASK [fedora.linux_system_roles.network : Check if system is ostree] ***********
task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:12
Tuesday 24 September 2024 15:00:59 -0400 (0:00:00.050) 0:00:15.471 *****

41684 1727204459.06982: entering _queue_task() for managed-node1/stat
41684 1727204459.07250: worker is 
1 (out of 1 available) 41684 1727204459.07268: exiting _queue_task() for managed-node1/stat 41684 1727204459.07280: done queuing things up, now waiting for results queue to drain 41684 1727204459.07281: waiting for pending results... 41684 1727204459.07556: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Check if system is ostree 41684 1727204459.07670: in run() - task 0affcd87-79f5-3839-086d-00000000049d 41684 1727204459.07680: variable 'ansible_search_path' from source: unknown 41684 1727204459.07684: variable 'ansible_search_path' from source: unknown 41684 1727204459.07712: calling self._execute() 41684 1727204459.07777: variable 'ansible_host' from source: host vars for 'managed-node1' 41684 1727204459.07781: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41684 1727204459.07789: variable 'omit' from source: magic vars 41684 1727204459.08048: variable 'ansible_distribution_major_version' from source: facts 41684 1727204459.08057: Evaluated conditional (ansible_distribution_major_version != '6'): True 41684 1727204459.08175: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 41684 1727204459.08369: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 41684 1727204459.08402: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 41684 1727204459.08428: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 41684 1727204459.08455: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 41684 1727204459.08520: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 41684 1727204459.08539: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 41684 1727204459.08557: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 41684 1727204459.08578: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 41684 1727204459.08641: variable '__network_is_ostree' from source: set_fact 41684 1727204459.08647: Evaluated conditional (not __network_is_ostree is defined): False 41684 1727204459.08650: when evaluation is False, skipping this task 41684 1727204459.08653: _execute() done 41684 1727204459.08655: dumping result to json 41684 1727204459.08657: done dumping result, returning 41684 1727204459.08667: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Check if system is ostree [0affcd87-79f5-3839-086d-00000000049d] 41684 1727204459.08671: sending task result for task 0affcd87-79f5-3839-086d-00000000049d 41684 1727204459.08749: done sending task result for task 0affcd87-79f5-3839-086d-00000000049d 41684 1727204459.08752: WORKER PROCESS EXITING skipping: [managed-node1] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 41684 1727204459.08805: no more pending results, returning what we have 41684 1727204459.08808: results queue empty 41684 1727204459.08809: checking for any_errors_fatal 41684 1727204459.08817: done checking for any_errors_fatal 41684 1727204459.08818: checking for max_fail_percentage 41684 1727204459.08820: done checking for max_fail_percentage 41684 1727204459.08820: checking to see if all hosts have failed and the running result is not ok 41684 
1727204459.08821: done checking to see if all hosts have failed 41684 1727204459.08822: getting the remaining hosts for this loop 41684 1727204459.08824: done getting the remaining hosts for this loop 41684 1727204459.08828: getting the next task for host managed-node1 41684 1727204459.08834: done getting next task for host managed-node1 41684 1727204459.08838: ^ task is: TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 41684 1727204459.08843: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 41684 1727204459.08856: getting variables 41684 1727204459.08857: in VariableManager get_vars() 41684 1727204459.08904: Calling all_inventory to load vars for managed-node1 41684 1727204459.08907: Calling groups_inventory to load vars for managed-node1 41684 1727204459.08909: Calling all_plugins_inventory to load vars for managed-node1 41684 1727204459.08917: Calling all_plugins_play to load vars for managed-node1 41684 1727204459.08920: Calling groups_plugins_inventory to load vars for managed-node1 41684 1727204459.08922: Calling groups_plugins_play to load vars for managed-node1 41684 1727204459.09045: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41684 1727204459.09183: done with get_vars() 41684 1727204459.09192: done getting variables 41684 1727204459.09233: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Set flag to indicate system is ostree] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:17 Tuesday 24 September 2024 15:00:59 -0400 (0:00:00.022) 0:00:15.494 ***** 41684 1727204459.09259: entering _queue_task() for managed-node1/set_fact 41684 1727204459.09517: worker is 1 (out of 1 available) 41684 1727204459.09540: exiting _queue_task() for managed-node1/set_fact 41684 1727204459.09553: done queuing things up, now waiting for results queue to drain 41684 1727204459.09555: waiting for pending results... 
41684 1727204459.09836: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 41684 1727204459.10006: in run() - task 0affcd87-79f5-3839-086d-00000000049e 41684 1727204459.10026: variable 'ansible_search_path' from source: unknown 41684 1727204459.10034: variable 'ansible_search_path' from source: unknown 41684 1727204459.10081: calling self._execute() 41684 1727204459.10173: variable 'ansible_host' from source: host vars for 'managed-node1' 41684 1727204459.10192: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41684 1727204459.10210: variable 'omit' from source: magic vars 41684 1727204459.10596: variable 'ansible_distribution_major_version' from source: facts 41684 1727204459.10620: Evaluated conditional (ansible_distribution_major_version != '6'): True 41684 1727204459.10800: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 41684 1727204459.11187: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 41684 1727204459.11234: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 41684 1727204459.11279: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 41684 1727204459.11322: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 41684 1727204459.11423: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 41684 1727204459.11453: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 41684 1727204459.11493: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 41684 1727204459.11535: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 41684 1727204459.11638: variable '__network_is_ostree' from source: set_fact 41684 1727204459.11651: Evaluated conditional (not __network_is_ostree is defined): False 41684 1727204459.11658: when evaluation is False, skipping this task 41684 1727204459.11666: _execute() done 41684 1727204459.11676: dumping result to json 41684 1727204459.11683: done dumping result, returning 41684 1727204459.11694: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree [0affcd87-79f5-3839-086d-00000000049e] 41684 1727204459.11711: sending task result for task 0affcd87-79f5-3839-086d-00000000049e 41684 1727204459.11831: done sending task result for task 0affcd87-79f5-3839-086d-00000000049e skipping: [managed-node1] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 41684 1727204459.11884: no more pending results, returning what we have 41684 1727204459.11889: results queue empty 41684 1727204459.11890: checking for any_errors_fatal 41684 1727204459.11895: done checking for any_errors_fatal 41684 1727204459.11896: checking for max_fail_percentage 41684 1727204459.11898: done checking for max_fail_percentage 41684 1727204459.11899: checking to see if all hosts have failed and the running result is not ok 41684 1727204459.11900: done checking to see if all hosts have failed 41684 1727204459.11901: getting the remaining hosts for this loop 41684 1727204459.11902: done getting the remaining hosts for this loop 41684 1727204459.11907: getting the next task for 
host managed-node1 41684 1727204459.11916: done getting next task for host managed-node1 41684 1727204459.11921: ^ task is: TASK: fedora.linux_system_roles.network : Check which services are running 41684 1727204459.11928: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 41684 1727204459.11943: getting variables 41684 1727204459.11945: in VariableManager get_vars() 41684 1727204459.11990: Calling all_inventory to load vars for managed-node1 41684 1727204459.11994: Calling groups_inventory to load vars for managed-node1 41684 1727204459.11997: Calling all_plugins_inventory to load vars for managed-node1 41684 1727204459.12008: Calling all_plugins_play to load vars for managed-node1 41684 1727204459.12011: Calling groups_plugins_inventory to load vars for managed-node1 41684 1727204459.12014: Calling groups_plugins_play to load vars for managed-node1 41684 1727204459.12271: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41684 1727204459.12636: done with get_vars() 41684 1727204459.12647: done getting variables 41684 1727204459.12682: WORKER PROCESS EXITING TASK [fedora.linux_system_roles.network : Check which services are running] **** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 Tuesday 24 September 2024 15:00:59 -0400 (0:00:00.035) 0:00:15.529 ***** 41684 1727204459.12831: entering _queue_task() for managed-node1/service_facts 41684 1727204459.12833: Creating lock for service_facts 41684 1727204459.13098: worker is 1 (out of 1 available) 41684 1727204459.13111: exiting _queue_task() for managed-node1/service_facts 41684 1727204459.13123: done queuing things up, now waiting for results queue to drain 41684 1727204459.13125: waiting for pending results... 
41684 1727204459.13290: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Check which services are running 41684 1727204459.13382: in run() - task 0affcd87-79f5-3839-086d-0000000004a0 41684 1727204459.13397: variable 'ansible_search_path' from source: unknown 41684 1727204459.13401: variable 'ansible_search_path' from source: unknown 41684 1727204459.13427: calling self._execute() 41684 1727204459.13493: variable 'ansible_host' from source: host vars for 'managed-node1' 41684 1727204459.13499: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41684 1727204459.13507: variable 'omit' from source: magic vars 41684 1727204459.13768: variable 'ansible_distribution_major_version' from source: facts 41684 1727204459.13776: Evaluated conditional (ansible_distribution_major_version != '6'): True 41684 1727204459.13782: variable 'omit' from source: magic vars 41684 1727204459.13832: variable 'omit' from source: magic vars 41684 1727204459.13854: variable 'omit' from source: magic vars 41684 1727204459.13887: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 41684 1727204459.13915: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 41684 1727204459.13933: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 41684 1727204459.13945: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41684 1727204459.13954: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41684 1727204459.13979: variable 'inventory_hostname' from source: host vars for 'managed-node1' 41684 1727204459.13982: variable 'ansible_host' from source: host vars for 'managed-node1' 41684 1727204459.13984: variable 'ansible_ssh_extra_args' from source: host 
vars for 'managed-node1' 41684 1727204459.14053: Set connection var ansible_connection to ssh 41684 1727204459.14057: Set connection var ansible_pipelining to False 41684 1727204459.15369: Set connection var ansible_module_compression to ZIP_DEFLATED 41684 1727204459.15377: Set connection var ansible_timeout to 10 41684 1727204459.15380: Set connection var ansible_shell_executable to /bin/sh 41684 1727204459.15382: Set connection var ansible_shell_type to sh 41684 1727204459.15384: variable 'ansible_shell_executable' from source: unknown 41684 1727204459.15385: variable 'ansible_connection' from source: unknown 41684 1727204459.15388: variable 'ansible_module_compression' from source: unknown 41684 1727204459.15393: variable 'ansible_shell_type' from source: unknown 41684 1727204459.15397: variable 'ansible_shell_executable' from source: unknown 41684 1727204459.15399: variable 'ansible_host' from source: host vars for 'managed-node1' 41684 1727204459.15401: variable 'ansible_pipelining' from source: unknown 41684 1727204459.15402: variable 'ansible_timeout' from source: unknown 41684 1727204459.15404: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41684 1727204459.15406: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 41684 1727204459.15409: variable 'omit' from source: magic vars 41684 1727204459.15411: starting attempt loop 41684 1727204459.15413: running the handler 41684 1727204459.15415: _low_level_execute_command(): starting 41684 1727204459.15416: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 41684 1727204459.15418: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 41684 1727204459.15420: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config 
<<< 41684 1727204459.15422: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41684 1727204459.15424: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41684 1727204459.15426: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 41684 1727204459.15428: stderr chunk (state=3): >>>debug2: match not found <<< 41684 1727204459.15430: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204459.15432: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 41684 1727204459.15434: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 41684 1727204459.15435: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 41684 1727204459.15437: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41684 1727204459.15439: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41684 1727204459.15441: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41684 1727204459.15443: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 41684 1727204459.15445: stderr chunk (state=3): >>>debug2: match found <<< 41684 1727204459.15446: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204459.15448: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 41684 1727204459.15450: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41684 1727204459.15452: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41684 1727204459.15753: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41684 
1727204459.17207: stdout chunk (state=3): >>>/root <<< 41684 1727204459.17514: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41684 1727204459.17518: stdout chunk (state=3): >>><<< 41684 1727204459.17521: stderr chunk (state=3): >>><<< 41684 1727204459.17524: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41684 1727204459.17527: _low_level_execute_command(): starting 41684 1727204459.17530: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204459.1741943-43235-218450032424825 `" && echo ansible-tmp-1727204459.1741943-43235-218450032424825="` echo /root/.ansible/tmp/ansible-tmp-1727204459.1741943-43235-218450032424825 `" ) && sleep 0' 41684 1727204459.18271: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 
3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 41684 1727204459.18287: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41684 1727204459.18321: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204459.18324: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 41684 1727204459.18327: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204459.18402: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 41684 1727204459.18405: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 41684 1727204459.18475: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41684 1727204459.20308: stdout chunk (state=3): >>>ansible-tmp-1727204459.1741943-43235-218450032424825=/root/.ansible/tmp/ansible-tmp-1727204459.1741943-43235-218450032424825 <<< 41684 1727204459.20421: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41684 1727204459.20504: stderr chunk (state=3): >>><<< 41684 1727204459.20507: stdout chunk (state=3): >>><<< 41684 1727204459.20571: _low_level_execute_command() done: rc=0, 
stdout=ansible-tmp-1727204459.1741943-43235-218450032424825=/root/.ansible/tmp/ansible-tmp-1727204459.1741943-43235-218450032424825 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41684 1727204459.20672: variable 'ansible_module_compression' from source: unknown 41684 1727204459.20675: ANSIBALLZ: Using lock for service_facts 41684 1727204459.20678: ANSIBALLZ: Acquiring lock 41684 1727204459.20680: ANSIBALLZ: Lock acquired: 139842517056688 41684 1727204459.20682: ANSIBALLZ: Creating module 41684 1727204459.34835: ANSIBALLZ: Writing module into payload 41684 1727204459.34959: ANSIBALLZ: Writing module 41684 1727204459.35123: ANSIBALLZ: Renaming module 41684 1727204459.35134: ANSIBALLZ: Done creating module 41684 1727204459.35501: variable 'ansible_facts' from source: unknown 41684 1727204459.35581: transferring module to remote 
/root/.ansible/tmp/ansible-tmp-1727204459.1741943-43235-218450032424825/AnsiballZ_service_facts.py 41684 1727204459.35751: Sending initial data 41684 1727204459.35754: Sent initial data (162 bytes) 41684 1727204459.36818: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 41684 1727204459.36833: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41684 1727204459.36850: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41684 1727204459.36874: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41684 1727204459.36917: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 41684 1727204459.36930: stderr chunk (state=3): >>>debug2: match not found <<< 41684 1727204459.36952: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204459.36979: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 41684 1727204459.36993: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 41684 1727204459.37012: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 41684 1727204459.37025: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41684 1727204459.37038: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41684 1727204459.37054: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41684 1727204459.37070: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 41684 1727204459.37082: stderr chunk (state=3): >>>debug2: match found <<< 41684 1727204459.37097: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 
1727204459.37175: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 41684 1727204459.37198: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41684 1727204459.37215: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41684 1727204459.37315: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41684 1727204459.39041: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 41684 1727204459.39094: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 41684 1727204459.39148: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-41684fyviudxd/tmp_890or1n /root/.ansible/tmp/ansible-tmp-1727204459.1741943-43235-218450032424825/AnsiballZ_service_facts.py <<< 41684 1727204459.39609: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 41684 1727204459.40874: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41684 1727204459.40996: stderr chunk (state=3): >>><<< 41684 1727204459.40999: stdout chunk (state=3): >>><<< 41684 1727204459.41002: done transferring module to remote 41684 1727204459.41004: _low_level_execute_command(): starting 41684 1727204459.41006: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x 
/root/.ansible/tmp/ansible-tmp-1727204459.1741943-43235-218450032424825/ /root/.ansible/tmp/ansible-tmp-1727204459.1741943-43235-218450032424825/AnsiballZ_service_facts.py && sleep 0' 41684 1727204459.42681: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 41684 1727204459.42704: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41684 1727204459.42720: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41684 1727204459.42739: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41684 1727204459.42790: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 41684 1727204459.42828: stderr chunk (state=3): >>>debug2: match not found <<< 41684 1727204459.42846: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204459.42866: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 41684 1727204459.42880: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 41684 1727204459.42893: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 41684 1727204459.42908: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41684 1727204459.42923: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41684 1727204459.42943: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41684 1727204459.42957: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 41684 1727204459.42972: stderr chunk (state=3): >>>debug2: match found <<< 41684 1727204459.42996: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204459.43098: 
stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 41684 1727204459.43128: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41684 1727204459.43145: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41684 1727204459.43243: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41684 1727204459.45039: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41684 1727204459.45042: stdout chunk (state=3): >>><<< 41684 1727204459.45045: stderr chunk (state=3): >>><<< 41684 1727204459.45142: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41684 1727204459.45145: _low_level_execute_command(): starting 41684 1727204459.45148: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 
/root/.ansible/tmp/ansible-tmp-1727204459.1741943-43235-218450032424825/AnsiballZ_service_facts.py && sleep 0' 41684 1727204459.46445: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 41684 1727204459.46566: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41684 1727204459.46569: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41684 1727204459.46607: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found <<< 41684 1727204459.46610: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 41684 1727204459.46613: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204459.46981: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 41684 1727204459.46992: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 41684 1727204459.47077: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41684 1727204460.75694: stdout chunk (state=3): >>> {"ansible_facts": {"services": {"auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", 
"source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": 
"dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "stopped", 
"status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "st<<< 41684 1727204460.75739: stdout chunk (state=3): >>>ate": "stopped", "status": 
"not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "snapd.seeded.service": {"name": "snapd.seeded.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": 
"stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": 
"static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles.service": {"name": "systemd-tmpfiles.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", 
"status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "yppasswdd.service": {"name": "yppasswdd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ypserv.service": {"name": "ypserv.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ypxfrd.service": {"name": "ypxfrd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "cpupower.service": {"name": "cpupower.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": 
"systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "inactive", "status": "static", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "kvm_stat.service": 
{"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "oddjobd.service": {"name": "oddjobd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon.service": {"name": "quotaon.service", "state": "inactive", "status": "static", "source": "systemd"}, "rdisc.service": {"name": "rdisc.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, 
"serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, 
"systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate-resume@.service": {"name": "systemd-hibernate-resume@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-quotacheck.service": {"name": "systemd-quotacheck.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", 
"state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "teamd@.service": {"name": "teamd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} <<< 41684 1727204460.77047: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.148 closed. 
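The stdout chunks above carry the `service_facts` module's JSON payload (`ansible_facts.services`, a dict keyed by unit name with `state`, `status`, and `source` fields). As a sketch, the same structure can be filtered offline with a few lines of Python; the blob below is abbreviated to three units copied from the log, not the full ~180-entry result:

```python
import json

# Abbreviated sample of the service_facts stdout shown in the log above.
raw = '''
{"ansible_facts": {"services": {
  "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"},
  "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"},
  "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}
}}, "invocation": {"module_args": {}}}
'''

services = json.loads(raw)["ansible_facts"]["services"]

# Keep only units systemd reports as currently running.
running = sorted(name for name, svc in services.items()
                 if svc["state"] == "running")
print(running)  # ['chronyd.service', 'sshd.service']
```

The same filter applies unchanged to the full payload once it is captured (e.g. redirected from a play that registers the `service_facts` result).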
<<< 41684 1727204460.77107: stderr chunk (state=3): >>><<< 41684 1727204460.77111: stdout chunk (state=3): >>><<< 41684 1727204460.77128: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"services": {"auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": 
"systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": 
"systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", 
"status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "snapd.seeded.service": {"name": "snapd.seeded.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": 
"stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": 
"systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", 
"source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles.service": {"name": "systemd-tmpfiles.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "yppasswdd.service": {"name": "yppasswdd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ypserv.service": {"name": "ypserv.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ypxfrd.service": {"name": "ypxfrd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, 
"container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "cpupower.service": {"name": "cpupower.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "inactive", "status": "static", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": 
{"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "oddjobd.service": {"name": "oddjobd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon.service": {"name": "quotaon.service", "state": "inactive", "status": "static", "source": "systemd"}, "rdisc.service": {"name": "rdisc.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": 
"rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, 
"systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate-resume@.service": {"name": "systemd-hibernate-resume@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": 
"inactive", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-quotacheck.service": {"name": "systemd-quotacheck.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "teamd@.service": {"name": "teamd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.148 closed. 41684 1727204460.77473: done with _execute_module (service_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'service_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204459.1741943-43235-218450032424825/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 41684 1727204460.77480: _low_level_execute_command(): starting 41684 1727204460.77484: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204459.1741943-43235-218450032424825/ > /dev/null 2>&1 && sleep 0' 41684 1727204460.78142: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 41684 1727204460.78186: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41684 1727204460.79968: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41684 1727204460.80014: stderr chunk (state=3): >>><<< 41684 1727204460.80018: stdout chunk (state=3): >>><<< 41684 1727204460.80031: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41684 1727204460.80039: handler run complete 41684 1727204460.80142: variable 'ansible_facts' from source: unknown 41684 1727204460.80225: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41684 1727204460.80471: variable 'ansible_facts' from source: unknown 41684 1727204460.80547: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41684 1727204460.80656: attempt loop complete, returning result 41684 1727204460.80659: _execute() done 41684 1727204460.80661: dumping result to json 41684 1727204460.80700: done dumping result, returning 41684 1727204460.80709: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Check which services are running [0affcd87-79f5-3839-086d-0000000004a0] 41684 1727204460.80714: sending task result for task 0affcd87-79f5-3839-086d-0000000004a0 ok: [managed-node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 41684 1727204460.81660: no more pending results, returning what we have 41684 1727204460.81668: results queue empty 41684 1727204460.81669: checking for any_errors_fatal 41684 1727204460.81672: done checking for any_errors_fatal 41684 1727204460.81673: checking for max_fail_percentage 41684 1727204460.81674: done checking for max_fail_percentage 41684 1727204460.81675: checking to see if all hosts have failed and the running result is not ok 41684 
1727204460.81676: done checking to see if all hosts have failed 41684 1727204460.81677: getting the remaining hosts for this loop 41684 1727204460.81678: done getting the remaining hosts for this loop 41684 1727204460.81681: getting the next task for host managed-node1 41684 1727204460.81686: done getting next task for host managed-node1 41684 1727204460.81689: ^ task is: TASK: fedora.linux_system_roles.network : Check which packages are installed 41684 1727204460.81694: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 41684 1727204460.81703: getting variables 41684 1727204460.81704: in VariableManager get_vars() 41684 1727204460.81735: Calling all_inventory to load vars for managed-node1 41684 1727204460.81738: Calling groups_inventory to load vars for managed-node1 41684 1727204460.81740: Calling all_plugins_inventory to load vars for managed-node1 41684 1727204460.81748: Calling all_plugins_play to load vars for managed-node1 41684 1727204460.81750: Calling groups_plugins_inventory to load vars for managed-node1 41684 1727204460.81753: Calling groups_plugins_play to load vars for managed-node1 41684 1727204460.82111: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41684 1727204460.82549: done with get_vars() 41684 1727204460.82558: done getting variables 41684 1727204460.82596: done sending task result for task 0affcd87-79f5-3839-086d-0000000004a0 41684 1727204460.82599: WORKER PROCESS EXITING TASK [fedora.linux_system_roles.network : Check which packages are installed] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26 Tuesday 24 September 2024 15:01:00 -0400 (0:00:01.698) 0:00:17.228 ***** 41684 1727204460.82651: entering _queue_task() for managed-node1/package_facts 41684 1727204460.82652: Creating lock for package_facts 41684 1727204460.82871: worker is 1 (out of 1 available) 41684 1727204460.82884: exiting _queue_task() for managed-node1/package_facts 41684 1727204460.82898: done queuing things up, now waiting for results queue to drain 41684 1727204460.82899: waiting for pending results... 
41684 1727204460.83079: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Check which packages are installed 41684 1727204460.83175: in run() - task 0affcd87-79f5-3839-086d-0000000004a1 41684 1727204460.83191: variable 'ansible_search_path' from source: unknown 41684 1727204460.83197: variable 'ansible_search_path' from source: unknown 41684 1727204460.83226: calling self._execute() 41684 1727204460.83294: variable 'ansible_host' from source: host vars for 'managed-node1' 41684 1727204460.83299: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41684 1727204460.83311: variable 'omit' from source: magic vars 41684 1727204460.83589: variable 'ansible_distribution_major_version' from source: facts 41684 1727204460.83600: Evaluated conditional (ansible_distribution_major_version != '6'): True 41684 1727204460.83606: variable 'omit' from source: magic vars 41684 1727204460.83654: variable 'omit' from source: magic vars 41684 1727204460.83681: variable 'omit' from source: magic vars 41684 1727204460.83713: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 41684 1727204460.83742: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 41684 1727204460.83758: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 41684 1727204460.83776: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41684 1727204460.83784: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41684 1727204460.83807: variable 'inventory_hostname' from source: host vars for 'managed-node1' 41684 1727204460.83810: variable 'ansible_host' from source: host vars for 'managed-node1' 41684 1727204460.83812: variable 'ansible_ssh_extra_args' from source: host 
vars for 'managed-node1' 41684 1727204460.83885: Set connection var ansible_connection to ssh 41684 1727204460.83889: Set connection var ansible_pipelining to False 41684 1727204460.83895: Set connection var ansible_module_compression to ZIP_DEFLATED 41684 1727204460.83900: Set connection var ansible_timeout to 10 41684 1727204460.83907: Set connection var ansible_shell_executable to /bin/sh 41684 1727204460.83910: Set connection var ansible_shell_type to sh 41684 1727204460.83928: variable 'ansible_shell_executable' from source: unknown 41684 1727204460.83931: variable 'ansible_connection' from source: unknown 41684 1727204460.83933: variable 'ansible_module_compression' from source: unknown 41684 1727204460.83937: variable 'ansible_shell_type' from source: unknown 41684 1727204460.83940: variable 'ansible_shell_executable' from source: unknown 41684 1727204460.83942: variable 'ansible_host' from source: host vars for 'managed-node1' 41684 1727204460.83945: variable 'ansible_pipelining' from source: unknown 41684 1727204460.83949: variable 'ansible_timeout' from source: unknown 41684 1727204460.83951: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41684 1727204460.84095: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 41684 1727204460.84102: variable 'omit' from source: magic vars 41684 1727204460.84108: starting attempt loop 41684 1727204460.84111: running the handler 41684 1727204460.84122: _low_level_execute_command(): starting 41684 1727204460.84129: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 41684 1727204460.84637: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
<<< 41684 1727204460.84655: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41684 1727204460.84674: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204460.84690: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204460.84734: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 41684 1727204460.84746: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 41684 1727204460.84809: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41684 1727204460.86343: stdout chunk (state=3): >>>/root <<< 41684 1727204460.86443: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41684 1727204460.86499: stderr chunk (state=3): >>><<< 41684 1727204460.86502: stdout chunk (state=3): >>><<< 41684 1727204460.86521: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41684 1727204460.86531: _low_level_execute_command(): starting 41684 1727204460.86536: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204460.865209-43299-142641344702267 `" && echo ansible-tmp-1727204460.865209-43299-142641344702267="` echo /root/.ansible/tmp/ansible-tmp-1727204460.865209-43299-142641344702267 `" ) && sleep 0' 41684 1727204460.86988: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 41684 1727204460.87006: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41684 1727204460.87020: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204460.87033: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204460.87085: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 41684 1727204460.87096: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 41684 1727204460.87157: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41684 1727204460.88990: stdout chunk (state=3): >>>ansible-tmp-1727204460.865209-43299-142641344702267=/root/.ansible/tmp/ansible-tmp-1727204460.865209-43299-142641344702267 <<< 41684 1727204460.89102: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41684 1727204460.89158: stderr chunk (state=3): >>><<< 41684 1727204460.89161: stdout chunk (state=3): >>><<< 41684 1727204460.89183: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204460.865209-43299-142641344702267=/root/.ansible/tmp/ansible-tmp-1727204460.865209-43299-142641344702267 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41684 1727204460.89219: variable 'ansible_module_compression' from source: unknown 41684 1727204460.89266: ANSIBALLZ: Using lock for package_facts 41684 1727204460.89270: ANSIBALLZ: Acquiring lock 41684 1727204460.89272: ANSIBALLZ: Lock acquired: 139842517593376 41684 1727204460.89274: ANSIBALLZ: Creating module 41684 1727204461.08627: ANSIBALLZ: Writing module into payload 41684 1727204461.08747: ANSIBALLZ: Writing module 41684 1727204461.08781: ANSIBALLZ: Renaming module 41684 1727204461.08785: ANSIBALLZ: Done creating module 41684 1727204461.08815: variable 'ansible_facts' from source: unknown 41684 1727204461.08952: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204460.865209-43299-142641344702267/AnsiballZ_package_facts.py 41684 1727204461.09076: Sending initial data 41684 1727204461.09087: Sent initial data (161 bytes) 41684 1727204461.09793: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 41684 1727204461.09797: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41684 1727204461.09834: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204461.09838: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: 
hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 41684 1727204461.09842: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found <<< 41684 1727204461.09845: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204461.09902: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 41684 1727204461.09905: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41684 1727204461.09907: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41684 1727204461.09974: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41684 1727204461.11689: stderr chunk (state=3): >>>debug2: Remote version: 3 <<< 41684 1727204461.11693: stderr chunk (state=3): >>>debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 41684 1727204461.11737: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 41684 1727204461.11797: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-41684fyviudxd/tmpfesg4ejt 
/root/.ansible/tmp/ansible-tmp-1727204460.865209-43299-142641344702267/AnsiballZ_package_facts.py <<< 41684 1727204461.11842: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 41684 1727204461.13690: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41684 1727204461.13824: stderr chunk (state=3): >>><<< 41684 1727204461.13839: stdout chunk (state=3): >>><<< 41684 1727204461.13852: done transferring module to remote 41684 1727204461.13871: _low_level_execute_command(): starting 41684 1727204461.13876: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204460.865209-43299-142641344702267/ /root/.ansible/tmp/ansible-tmp-1727204460.865209-43299-142641344702267/AnsiballZ_package_facts.py && sleep 0' 41684 1727204461.14347: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 41684 1727204461.14359: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41684 1727204461.14389: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found <<< 41684 1727204461.14407: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204461.14489: stderr chunk (state=3): 
>>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 41684 1727204461.14538: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41684 1727204461.16246: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41684 1727204461.16297: stderr chunk (state=3): >>><<< 41684 1727204461.16300: stdout chunk (state=3): >>><<< 41684 1727204461.16314: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41684 1727204461.16321: _low_level_execute_command(): starting 41684 1727204461.16324: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204460.865209-43299-142641344702267/AnsiballZ_package_facts.py && sleep 0' 41684 1727204461.16937: stderr chunk (state=2): >>>OpenSSH_8.7p1, 
OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 41684 1727204461.16995: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41684 1727204461.64895: stdout chunk (state=3): >>> {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240905", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240905", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.1.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools-wheel": [{"name": "python3-setuptools-wheel", "version": "53.0.0", 
"release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20210518", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.40", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.4", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libreport-filesystem": [{"name": "libreport-filesystem", "version": "2.15.2", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.348", "release": "9.15.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "7.el9.1", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dejavu-sans-fonts": [{"name": "dejavu-sans-fonts", "version": "2.37", "release": "18.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-core-font-en": [{"name": "langpacks-core-font-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "8.32", "release": "36.el9", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.13.7", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.16", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.1.8", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib": [{"name": "zlib", "version": "1.2.11", "release": "41.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.1", "release": "2.el9", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.48", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.18", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.34.1", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.4", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.9.13", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240828", "release": "2.git626aa59.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.24", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.2", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": 
[{"name": "audit-libs", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "0.9.10", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.4", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.2", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.42", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.0", "release": "13.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{<<< 41684 1727204461.65090: stdout chunk (state=3): >>>"name": "libnl3", "version": "3.9.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.3", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.40", 
"release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.8", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.8.0", "release": "7.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.9", "release": "9.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.10.0", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.39", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "38.20210216cvs.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.4", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.14", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.16.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": 
"0.25.3", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.3", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.39", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.5", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.4.0", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.6", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdb": [{"name": "libdb", "version": "5.3.28", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.2.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.2", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "libsigsegv": [{"name": "libsigsegv", "version": "2.13", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre": [{"name": "pcre", "version": "8.44", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.6", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "8.32", "release": "36.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "91.4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "252", "release": "47.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": 
"1.12", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "28", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "28", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.6", "release": "27.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "29.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.6", "release": "27.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-tools": [{"name": "dbus-tools", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "3.3.17", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.5.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.4", "release": "10.git1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.18", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"libselinux-utils": [{"name": "libselinux-utils", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3-cli": [{"name": "libnl3-cli", "version": "3.9.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libteam": [{"name": "libteam", "version": "1.31", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "acl": [{"name": "acl", "version": "2.3.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.21", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext": [{"name": "gettext", "version": "0.21", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "attr": [{"name": "attr", "version": "2.5.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.1.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.1.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.1", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.5.1", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.2", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.11", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.11", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.22.4", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"snappy": [{"name": "snappy", "version": "1.1.8", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-compat": [{"name": "libxcrypt-compat", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "21.3.1", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.18", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.15.0", "release": "9.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.1", "release": "7.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "234", "release": "19.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.2", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "6.el9", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "590", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-rpm-macros": [{"name": "systemd-rpm-macros", "version": "252", "release": "47.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.19.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.13", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.7", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "49", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.0.9", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.7.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdaemon": [{"name": "libdaemon", "version": "0.14", "release": "23.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "teamd": [{"name": "teamd", "version": "1.31", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.4.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.4", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.5.1", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.12.20", "release": 
"8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "252", "release": "47.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "252", "release": "47.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "28", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.5.7", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.5.7", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": "1.11", "release": "26.20190603git.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el9", "epoch": 9, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el9", "epoch": 9, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": 
"2.7.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.8.7", "release": "32.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.11.8", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20210202", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.4.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.4.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.18.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oddjob": [{"name": "oddjob", "version": "0.34.7", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oddjob-mkhomedir": [{"name": "oddjob-mkhomedir", "version": "0.34.7", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": 
"libnetfilter_conntrack", "version": "1.0.9", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "3.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "3.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.43.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.27", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.6", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}],
"libssh": [{"name": "libssh", "version": "0.10.4", "release": "13.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "7.76.1", "release": "31.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.35.2", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.35.2", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "3.2.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "252", "release": "47.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.13.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.77", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": 
"2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "7.76.1", "release": "31.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "63.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.24", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "38.1.45", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "38.1.45", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.29", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", 
"version": "2.9.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.9.1", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.68.4", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.2.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.13.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.68.0", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.40.1", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.40.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libuser": [{"name": "libuser", "version": "0.63", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"gnupg2": [{"name": "gnupg2", "version": "2.3.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpgme": [{"name": "gpgme", "version": "1.15.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.14.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gpg": [{"name": "python3-gpg", "version": "1.15.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.6", "release": "1.el9.6", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.6", "release": "17.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"pciutils-libs": [{"name": "pciutils-libs", "version": "3.7.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.47", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.2", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.21", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.12.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "0.99.9", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsyslog-logrotate": [{"name": "rsyslog-logrotate", "version": "8.2310.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2310.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "10.el9", "epoch": 17, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.4.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], 
"sg3_utils": [{"name": "sg3_utils", "version": "1.47", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.27", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "passwd": [{"name": "passwd", "version": "0.80", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "NetworkManager-team": [{"name": "NetworkManager-team", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.11.8", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240828", "release": "2.git626aa59.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.16.1.3", "release": 
"34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.5p2", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.9.3", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.2.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-compat": [{"name": "authselect-compat", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el9", "epoch": 4, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.1.0", "release": 
"8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.19.2", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwl100-firmware": [{"name": "iwl100-firmware", "version": "39.31.5.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl1000-firmware": [{"name": "iwl1000-firmware", "version": "39.31.5.1", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "iwl105-firmware": [{"name": "iwl105-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl135-firmware": [{"name": "iwl135-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl2000-firmware": [{"name": "iwl2000-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl2030-firmware": [{"name": "iwl2030-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl3160-firmware": [{"name": "iwl3160-firmware", "version": "25.30.13.0", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "iwl5000-firmware": [{"name": "iwl5000-firmware", "version": "8.83.5.1_1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl5150-firmware": [{"name": "iwl5150-firmware", "version": "8.24.2.2", "release": "146.el9", "epoch": null, "arch": 
"noarch", "source": "rpm"}], "iwl6000g2a-firmware": [{"name": "iwl6000g2a-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl6050-firmware": [{"name": "iwl6050-firmware", "version": "41.28.5.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl7260-firmware": [{"name": "iwl7260-firmware", "version": "25.30.13.0", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "31.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "gpg-pubkey": [{"name": "gpg-pubkey", "version": "3228467c", "release": "613798eb", "epoch": null, "arch": null, "source": "rpm"}, {"name": "gpg-pubkey", "version": "8483c65d", "release": "5ccc5b19", "epoch": null, "arch": null, "source": "rpm"}], "epel-release": [{"name": "epel-release", "version": "9", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "nspr": [{"name": "nspr", "version": "4.35.0", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-system": [{"name": "boost-system", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-util": [{"name": "nss-util", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.3", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "m4": [{"name": "m4", "version": "1.4.19", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmpc": [{"name": "libmpc", "version": "1.2.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unzip": [{"name": "unzip", "version": "6.0", "release": "57.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "avahi-libs": [{"name": "avahi-libs", "version": "0.8", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zip": [{"name": "zip", "version": 
"3.0", "release": "35.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpp": [{"name": "cpp", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bison": [{"name": "bison", "version": "3.7.4", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "flex": [{"name": "flex", "version": "2.6.4", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn-freebl": [{"name": "nss-softokn-freebl", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn": [{"name": "nss-softokn", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss": [{"name": "nss", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-sysinit": [{"name": "nss-sysinit", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-filesystem": [{"name": "boost-filesystem", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-thread": [{"name": "boost-thread", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.19", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.58", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.80", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.03", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.174", "release": "462.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.13", "release": "4.el9", 
"epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.09", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20200520", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.60.800", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.41", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.300", "release": "7.el9", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2013.0523", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.073", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.66", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "1.94", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Term-ANSIColor": 
[{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.21", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-subs": [{"name": "perl-subs", "version": "1.03", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.17", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.42", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.076", "release": "462.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.031", "release": "4.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.08", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.09", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "4.14", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.13", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.30", "release": "460.el9", "epoch": null, 
"arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.23", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.43", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.01", "release": "4.el9", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.30", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.85", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.12", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.56", "release": "462.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.21", "release": "460.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.31", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.238", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.52", 
"release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.50", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.74", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.15", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.78", "release": "461.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.08", "release": "462.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.32.1", "release": "481.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.32.1", "release": "481.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "aspell": [{"name": "aspell", "version": "0.60.8", "release": "8.el9", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "tbb": [{"name": "tbb", "version": "2020.3", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dyninst": [{"name": "dyninst", "version": "12.1.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-runtime": [{"name": "systemtap-runtime", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-headers": [{"name": "kernel-headers", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-headers": [{"name": "glibc-headers", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "strace": [{"name": "strace", "version": "5.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-m4": [{"name": "pkgconf-m4", "version": "1.7.3", "release": "10.el9", 
"epoch": null, "arch": "noarch", "source": "rpm"}], "libpkgconf": [{"name": "libpkgconf", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf": [{"name": "pkgconf", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-pkg-config": [{"name": "pkgconf-pkg-config", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd-devel": [{"name": "libzstd-devel", "version": "1.5.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-devel": [{"name": "zlib-devel", "version": "1.2.11", "release": "41.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf-devel": [{"name": "elfutils-libelf-devel", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-devel": [{"name": "glibc-devel", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-devel": [{"name": "libxcrypt-devel", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gcc": [{"name": "gcc", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-devel": [{"name": "openssl-devel", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-devel": [{"name": "kernel-devel", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-devel": [{"name": "xz-devel", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-devel": [{"name": "elfutils-devel", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-devel": [{"name": "systemtap-devel", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "efivar-libs": [{"name": "efivar-libs", "version": "38", 
"release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mokutil": [{"name": "mokutil", "version": "0.6.0", "release": "4.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "systemtap-client": [{"name": "systemtap-client", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap": [{"name": "systemtap", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "qa-tools": [{"name": "qa-tools", "version": "4.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.3", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.5.4", "release": "27.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.6", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.21.1", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.37", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "7.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.47", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "11.el9", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.34", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "4.6.5", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpm-libs": [{"name": "gpm-libs", "version": "1.20.7", "release": "29.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "emacs-filesystem": [{"name": "emacs-filesystem", "version": "27.2", "release": "10.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "18.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.34", "release": "7.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "4.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "5.4.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "3.el9", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.8.4", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.5.4", "release": "27.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.2.3", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "53.0.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.5.0", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "2.10", "release": "7.el9.1", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pytz": [{"name": "python3-pytz", "version": "2021.1", "release": "5.el9", 
"epoch": null, "arch": "noarch", "source": "rpm"}], "python3-babel": [{"name": "python3-babel", "version": "2.9.1", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.4", "release": "12.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyrsistent": [{"name": "python3-pyrsistent", "version": "0.17.3", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-prettytable": [{"name": "python3-prettytable", "version": "0.7.2", "release": "27.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.1.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-netifaces": [{"name": "python3-netifaces", "version": "0.10.6", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "1.1.1", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "2.11.3", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.0", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.21", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.6", "release": "25.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "20.3.0", "release": 
"7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "3.2.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libmaxminddb": [{"name": "libmaxminddb", "version": "1.5.2", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "geolite2-country": [{"name": "geolite2-country", "version": "20191217", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "geolite2-city": [{"name": "geolite2-city", "version": "20191217", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "ipcalc": [{"name": "ipcalc", "version": "1.0.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdisk": [{"name": "gdisk", "version": "1.0.7", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pysocks": [{"name": "python3-pysocks", "version": "1.7.1", "release": "12.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.5", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-chardet": [{"name": "python3-chardet", "version": "4.0.0", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.25.1", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dhcp-common": [{"name": "dhcp-common", "version": "4.4.2", "release": "19.b1.el9", "epoch": 12, "arch": "noarch", "source": "rpm"}], "dhcp-client": [{"name": "dhcp-client", "version": "4.4.2", "release": "19.b1.el9", "epoch": 12, "arch": "x86_64", "source": "rpm"}], 
"cloud-init": [{"name": "cloud-init", "version": "23.4", "release": "19.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.16", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip": [{"name": "python3-pip", "version": "21.3.1", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "5.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.85", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} <<< 41684 1727204461.66485: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41684 1727204461.66591: stderr chunk (state=3): >>>Shared connection to 10.31.9.148 closed. 
<<< 41684 1727204461.66594: stdout chunk (state=3): >>><<< 41684 1727204461.66596: stderr chunk (state=3): >>><<< 41684 1727204461.66672: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240905", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240905", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.1.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools-wheel": [{"name": "python3-setuptools-wheel", "version": "53.0.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20210518", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.40", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.4", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libreport-filesystem": [{"name": "libreport-filesystem", "version": "2.15.2", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.4.0", "release": "10.el9", 
"epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.348", "release": "9.15.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "7.el9.1", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dejavu-sans-fonts": [{"name": "dejavu-sans-fonts", "version": "2.37", "release": "18.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-core-font-en": [{"name": "langpacks-core-font-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "8.32", "release": "36.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.13.7", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.16", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.1.8", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib": [{"name": "zlib", "version": "1.2.11", "release": "41.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.48", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.18", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.34.1", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "11.5.0", "release": "2.el9", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.4", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.9.13", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240828", "release": "2.git626aa59.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.24", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.2", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "0.9.10", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.4", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.2", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.42", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.0", "release": "13.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], 
"keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.3", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.40", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.8", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.8.0", "release": "7.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.9", "release": "9.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.10.0", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.39", "release": "16.el9", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "38.20210216cvs.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.4", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.14", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.16.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.3", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.3", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.39", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.5", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.4.0", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.6", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "53.el9", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdb": [{"name": "libdb", "version": "5.3.28", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.2.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.2", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsigsegv": [{"name": "libsigsegv", "version": "2.13", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre": [{"name": "pcre", "version": "8.44", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.6", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "8.32", "release": "36.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": 
"ca-certificates", "version": "2024.2.69_v8.0.303", "release": "91.4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "252", "release": "47.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.12", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "28", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "28", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.6", "release": "27.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "29.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.6", "release": "27.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-tools": [{"name": "dbus-tools", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "3.3.17", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "6.el9", "epoch": 
1, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.5.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.4", "release": "10.git1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.18", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3-cli": [{"name": "libnl3-cli", "version": "3.9.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libteam": [{"name": "libteam", "version": "1.31", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "acl": [{"name": "acl", "version": "2.3.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.21", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext": [{"name": "gettext", "version": "0.21", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "attr": [{"name": "attr", "version": "2.5.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": 
"4.1.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.1.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.1", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.5.1", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.2", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.11", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.11", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.22.4", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.8", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-compat": [{"name": "libxcrypt-compat", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "21.3.1", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.9.19", "release": "8.el9", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.18", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.15.0", "release": "9.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.1", "release": "7.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "234", "release": "19.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.2", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "590", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-rpm-macros": [{"name": "systemd-rpm-macros", "version": "252", "release": "47.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.19.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.13", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.7", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "49", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.0.9", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "libcbor": [{"name": "libcbor", "version": "0.7.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdaemon": [{"name": "libdaemon", "version": "0.14", "release": "23.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "teamd": [{"name": "teamd", "version": "1.31", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.4.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.4", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.5.1", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "252", "release": "47.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "252", "release": "47.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "28", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.5.7", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.5.7", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": "1.11", "release": "26.20190603git.el9", 
"epoch": null, "arch": "noarch", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el9", "epoch": 9, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el9", "epoch": 9, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.8.7", "release": "32.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.11.8", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20210202", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.4.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.4.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"logrotate": [{"name": "logrotate", "version": "3.18.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oddjob": [{"name": "oddjob", "version": "0.34.7", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oddjob-mkhomedir": [{"name": "oddjob-mkhomedir", "version": "0.34.7", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "3.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "3.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.43.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"libpipeline": [{"name": "libpipeline", "version": "1.5.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.27", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.6", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.4", "release": "13.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "7.76.1", "release": "31.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.35.2", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.35.2", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "3.2.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "252", "release": "47.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"kernel-modules-core": [{"name": "kernel-modules-core", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.13.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.77", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "7.76.1", "release": "31.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "63.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.24", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "38.1.45", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "38.1.45", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.29", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.9.1", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.68.4", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.2.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.13.0", "release": "2.el9", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.68.0", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.40.1", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.40.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libuser": [{"name": "libuser", "version": "0.63", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.3.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpgme": [{"name": "gpgme", "version": "1.15.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.14.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gpg": [{"name": "python3-gpg", "version": "1.15.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.16.1.3", "release": 
"34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.6", "release": "1.el9.6", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.6", "release": "17.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.7.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.47", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.2", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.21", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.12.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "0.99.9", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsyslog-logrotate": [{"name": "rsyslog-logrotate", "version": "8.2310.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2310.0", "release": "4.el9", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "10.el9", "epoch": 17, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.4.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.47", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.27", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "passwd": [{"name": "passwd", "version": "0.80", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "NetworkManager-team": [{"name": "NetworkManager-team", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": 
"10.11.8", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240828", "release": "2.git626aa59.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.5p2", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.9.3", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": 
"iproute-tc", "version": "6.2.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-compat": [{"name": "authselect-compat", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el9", "epoch": 4, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.1.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.19.2", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwl100-firmware": [{"name": "iwl100-firmware", "version": "39.31.5.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl1000-firmware": [{"name": "iwl1000-firmware", "version": "39.31.5.1", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "iwl105-firmware": [{"name": "iwl105-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"iwl135-firmware": [{"name": "iwl135-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl2000-firmware": [{"name": "iwl2000-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl2030-firmware": [{"name": "iwl2030-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl3160-firmware": [{"name": "iwl3160-firmware", "version": "25.30.13.0", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "iwl5000-firmware": [{"name": "iwl5000-firmware", "version": "8.83.5.1_1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl5150-firmware": [{"name": "iwl5150-firmware", "version": "8.24.2.2", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl6000g2a-firmware": [{"name": "iwl6000g2a-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl6050-firmware": [{"name": "iwl6050-firmware", "version": "41.28.5.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl7260-firmware": [{"name": "iwl7260-firmware", "version": "25.30.13.0", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "31.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "gpg-pubkey": [{"name": "gpg-pubkey", "version": "3228467c", "release": "613798eb", "epoch": null, "arch": null, "source": "rpm"}, {"name": "gpg-pubkey", "version": "8483c65d", "release": "5ccc5b19", "epoch": null, "arch": null, "source": "rpm"}], "epel-release": [{"name": "epel-release", "version": "9", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "nspr": [{"name": "nspr", "version": "4.35.0", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-system": [{"name": 
"boost-system", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-util": [{"name": "nss-util", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.3", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "m4": [{"name": "m4", "version": "1.4.19", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmpc": [{"name": "libmpc", "version": "1.2.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unzip": [{"name": "unzip", "version": "6.0", "release": "57.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "avahi-libs": [{"name": "avahi-libs", "version": "0.8", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zip": [{"name": "zip", "version": "3.0", "release": "35.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpp": [{"name": "cpp", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bison": [{"name": "bison", "version": "3.7.4", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "flex": [{"name": "flex", "version": "2.6.4", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn-freebl": [{"name": "nss-softokn-freebl", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn": [{"name": "nss-softokn", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss": [{"name": "nss", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-sysinit": [{"name": "nss-sysinit", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-filesystem": [{"name": "boost-filesystem", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-thread": 
[{"name": "boost-thread", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.19", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.58", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.80", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.03", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.174", "release": "462.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.13", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.09", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20200520", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.60.800", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.41", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.300", "release": "7.el9", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2013.0523", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.073", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.66", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "1.94", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.21", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-subs": [{"name": "perl-subs", "version": "1.03", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.17", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.42", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.076", "release": "462.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.031", "release": "4.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", 
"version": "1.02", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.08", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.09", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "4.14", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.13", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.30", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.23", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.43", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.01", "release": "4.el9", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.30", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.85", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.12", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.56", "release": "462.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.21", "release": "460.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.31", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.238", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.52", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.50", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.74", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.15", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.78", "release": "461.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.08", "release": "462.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.32.1", "release": "481.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.32.1", "release": "481.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "aspell": [{"name": "aspell", "version": "0.60.8", "release": "8.el9", "epoch": 12, "arch": "x86_64", 
"source": "rpm"}], "tbb": [{"name": "tbb", "version": "2020.3", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dyninst": [{"name": "dyninst", "version": "12.1.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-runtime": [{"name": "systemtap-runtime", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-headers": [{"name": "kernel-headers", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-headers": [{"name": "glibc-headers", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "strace": [{"name": "strace", "version": "5.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-m4": [{"name": "pkgconf-m4", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libpkgconf": [{"name": "libpkgconf", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf": [{"name": "pkgconf", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-pkg-config": [{"name": "pkgconf-pkg-config", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd-devel": [{"name": "libzstd-devel", "version": "1.5.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-devel": [{"name": "zlib-devel", "version": "1.2.11", "release": "41.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf-devel": [{"name": "elfutils-libelf-devel", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-devel": [{"name": "glibc-devel", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-devel": [{"name": "libxcrypt-devel", "version": "4.4.18", "release": "3.el9", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "gcc": [{"name": "gcc", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-devel": [{"name": "openssl-devel", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-devel": [{"name": "kernel-devel", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-devel": [{"name": "xz-devel", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-devel": [{"name": "elfutils-devel", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-devel": [{"name": "systemtap-devel", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "efivar-libs": [{"name": "efivar-libs", "version": "38", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mokutil": [{"name": "mokutil", "version": "0.6.0", "release": "4.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "systemtap-client": [{"name": "systemtap-client", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap": [{"name": "systemtap", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "qa-tools": [{"name": "qa-tools", "version": "4.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.3", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.5.4", "release": "27.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": 
[{"name": "rpcbind", "version": "1.2.6", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.21.1", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.37", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "7.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.47", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.34", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "4.6.5", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpm-libs": [{"name": "gpm-libs", "version": "1.20.7", "release": "29.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "emacs-filesystem": [{"name": "emacs-filesystem", "version": "27.2", "release": "10.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "8.2.2637", 
"release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "18.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.34", "release": "7.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "4.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "5.4.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.8.4", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.5.4", "release": "27.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": 
"sssd-nfs-idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.2.3", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "53.0.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.5.0", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "2.10", "release": "7.el9.1", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pytz": [{"name": "python3-pytz", "version": "2021.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-babel": [{"name": "python3-babel", "version": "2.9.1", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.4", "release": "12.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyrsistent": [{"name": "python3-pyrsistent", "version": "0.17.3", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-prettytable": [{"name": "python3-prettytable", "version": "0.7.2", "release": "27.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.1.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-netifaces": [{"name": "python3-netifaces", "version": "0.10.6", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "1.1.1", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": 
"2.11.3", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.0", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.21", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.6", "release": "25.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "20.3.0", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "3.2.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libmaxminddb": [{"name": "libmaxminddb", "version": "1.5.2", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "geolite2-country": [{"name": "geolite2-country", "version": "20191217", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "geolite2-city": [{"name": "geolite2-city", "version": "20191217", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "ipcalc": [{"name": "ipcalc", "version": "1.0.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdisk": [{"name": "gdisk", "version": "1.0.7", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": 
"noarch", "source": "rpm"}], "python3-pysocks": [{"name": "python3-pysocks", "version": "1.7.1", "release": "12.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.5", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-chardet": [{"name": "python3-chardet", "version": "4.0.0", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.25.1", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dhcp-common": [{"name": "dhcp-common", "version": "4.4.2", "release": "19.b1.el9", "epoch": 12, "arch": "noarch", "source": "rpm"}], "dhcp-client": [{"name": "dhcp-client", "version": "4.4.2", "release": "19.b1.el9", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "23.4", "release": "19.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.16", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip": [{"name": "python3-pip", "version": "21.3.1", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "5.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.85", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} , 
stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.148 closed. 
41684 1727204461.73448: done with _execute_module (package_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'package_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204460.865209-43299-142641344702267/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 41684 1727204461.73466: _low_level_execute_command(): starting 41684 1727204461.73473: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204460.865209-43299-142641344702267/ > /dev/null 2>&1 && sleep 0' 41684 1727204461.73923: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 41684 1727204461.73937: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41684 1727204461.73958: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 41684 1727204461.73977: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 
originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204461.74014: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 41684 1727204461.74026: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 41684 1727204461.74092: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41684 1727204461.76341: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41684 1727204461.76378: stderr chunk (state=3): >>><<< 41684 1727204461.76386: stdout chunk (state=3): >>><<< 41684 1727204461.76404: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41684 1727204461.76419: handler run complete 41684 1727204461.77329: variable 'ansible_facts' from 
source: unknown 41684 1727204461.77604: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41684 1727204461.79771: variable 'ansible_facts' from source: unknown 41684 1727204461.80370: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41684 1727204461.80813: attempt loop complete, returning result 41684 1727204461.80825: _execute() done 41684 1727204461.80829: dumping result to json 41684 1727204461.80956: done dumping result, returning 41684 1727204461.80970: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Check which packages are installed [0affcd87-79f5-3839-086d-0000000004a1] 41684 1727204461.80976: sending task result for task 0affcd87-79f5-3839-086d-0000000004a1 41684 1727204461.82899: done sending task result for task 0affcd87-79f5-3839-086d-0000000004a1 41684 1727204461.82902: WORKER PROCESS EXITING ok: [managed-node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 41684 1727204461.82941: no more pending results, returning what we have 41684 1727204461.82943: results queue empty 41684 1727204461.82943: checking for any_errors_fatal 41684 1727204461.82946: done checking for any_errors_fatal 41684 1727204461.82947: checking for max_fail_percentage 41684 1727204461.82948: done checking for max_fail_percentage 41684 1727204461.82948: checking to see if all hosts have failed and the running result is not ok 41684 1727204461.82949: done checking to see if all hosts have failed 41684 1727204461.82949: getting the remaining hosts for this loop 41684 1727204461.82950: done getting the remaining hosts for this loop 41684 1727204461.82953: getting the next task for host managed-node1 41684 1727204461.82957: done getting next task for host managed-node1 41684 1727204461.82960: ^ task is: TASK: fedora.linux_system_roles.network : Print 
network provider 41684 1727204461.82962: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 41684 1727204461.82973: getting variables 41684 1727204461.82974: in VariableManager get_vars() 41684 1727204461.82999: Calling all_inventory to load vars for managed-node1 41684 1727204461.83001: Calling groups_inventory to load vars for managed-node1 41684 1727204461.83002: Calling all_plugins_inventory to load vars for managed-node1 41684 1727204461.83008: Calling all_plugins_play to load vars for managed-node1 41684 1727204461.83010: Calling groups_plugins_inventory to load vars for managed-node1 41684 1727204461.83011: Calling groups_plugins_play to load vars for managed-node1 41684 1727204461.83723: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41684 1727204461.84719: done with get_vars() 41684 1727204461.84736: done getting variables 41684 1727204461.84784: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Print network provider] ************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7 Tuesday 24 September 2024 15:01:01 -0400 
(0:00:01.021) 0:00:18.249 ***** 41684 1727204461.84809: entering _queue_task() for managed-node1/debug 41684 1727204461.85029: worker is 1 (out of 1 available) 41684 1727204461.85042: exiting _queue_task() for managed-node1/debug 41684 1727204461.85056: done queuing things up, now waiting for results queue to drain 41684 1727204461.85057: waiting for pending results... 41684 1727204461.85236: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Print network provider 41684 1727204461.85326: in run() - task 0affcd87-79f5-3839-086d-00000000001c 41684 1727204461.85339: variable 'ansible_search_path' from source: unknown 41684 1727204461.85342: variable 'ansible_search_path' from source: unknown 41684 1727204461.85376: calling self._execute() 41684 1727204461.85441: variable 'ansible_host' from source: host vars for 'managed-node1' 41684 1727204461.85445: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41684 1727204461.85453: variable 'omit' from source: magic vars 41684 1727204461.85732: variable 'ansible_distribution_major_version' from source: facts 41684 1727204461.85743: Evaluated conditional (ansible_distribution_major_version != '6'): True 41684 1727204461.85749: variable 'omit' from source: magic vars 41684 1727204461.85789: variable 'omit' from source: magic vars 41684 1727204461.85867: variable 'network_provider' from source: set_fact 41684 1727204461.85880: variable 'omit' from source: magic vars 41684 1727204461.85915: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 41684 1727204461.85943: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 41684 1727204461.85965: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 41684 1727204461.85978: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, 
class_only=False) 41684 1727204461.85987: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41684 1727204461.86011: variable 'inventory_hostname' from source: host vars for 'managed-node1' 41684 1727204461.86014: variable 'ansible_host' from source: host vars for 'managed-node1' 41684 1727204461.86017: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41684 1727204461.86088: Set connection var ansible_connection to ssh 41684 1727204461.86092: Set connection var ansible_pipelining to False 41684 1727204461.86098: Set connection var ansible_module_compression to ZIP_DEFLATED 41684 1727204461.86103: Set connection var ansible_timeout to 10 41684 1727204461.86110: Set connection var ansible_shell_executable to /bin/sh 41684 1727204461.86113: Set connection var ansible_shell_type to sh 41684 1727204461.86131: variable 'ansible_shell_executable' from source: unknown 41684 1727204461.86135: variable 'ansible_connection' from source: unknown 41684 1727204461.86138: variable 'ansible_module_compression' from source: unknown 41684 1727204461.86140: variable 'ansible_shell_type' from source: unknown 41684 1727204461.86144: variable 'ansible_shell_executable' from source: unknown 41684 1727204461.86147: variable 'ansible_host' from source: host vars for 'managed-node1' 41684 1727204461.86150: variable 'ansible_pipelining' from source: unknown 41684 1727204461.86152: variable 'ansible_timeout' from source: unknown 41684 1727204461.86155: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41684 1727204461.86252: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 41684 1727204461.86265: variable 
'omit' from source: magic vars 41684 1727204461.86268: starting attempt loop 41684 1727204461.86271: running the handler 41684 1727204461.86307: handler run complete 41684 1727204461.86317: attempt loop complete, returning result 41684 1727204461.86320: _execute() done 41684 1727204461.86322: dumping result to json 41684 1727204461.86324: done dumping result, returning 41684 1727204461.86332: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Print network provider [0affcd87-79f5-3839-086d-00000000001c] 41684 1727204461.86336: sending task result for task 0affcd87-79f5-3839-086d-00000000001c 41684 1727204461.86415: done sending task result for task 0affcd87-79f5-3839-086d-00000000001c 41684 1727204461.86418: WORKER PROCESS EXITING ok: [managed-node1] => {} MSG: Using network provider: nm 41684 1727204461.86474: no more pending results, returning what we have 41684 1727204461.86478: results queue empty 41684 1727204461.86479: checking for any_errors_fatal 41684 1727204461.86487: done checking for any_errors_fatal 41684 1727204461.86488: checking for max_fail_percentage 41684 1727204461.86489: done checking for max_fail_percentage 41684 1727204461.86490: checking to see if all hosts have failed and the running result is not ok 41684 1727204461.86491: done checking to see if all hosts have failed 41684 1727204461.86492: getting the remaining hosts for this loop 41684 1727204461.86493: done getting the remaining hosts for this loop 41684 1727204461.86497: getting the next task for host managed-node1 41684 1727204461.86504: done getting next task for host managed-node1 41684 1727204461.86507: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 41684 1727204461.86510: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, 
pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 41684 1727204461.86520: getting variables 41684 1727204461.86522: in VariableManager get_vars() 41684 1727204461.86561: Calling all_inventory to load vars for managed-node1 41684 1727204461.86566: Calling groups_inventory to load vars for managed-node1 41684 1727204461.86568: Calling all_plugins_inventory to load vars for managed-node1 41684 1727204461.86578: Calling all_plugins_play to load vars for managed-node1 41684 1727204461.86580: Calling groups_plugins_inventory to load vars for managed-node1 41684 1727204461.86582: Calling groups_plugins_play to load vars for managed-node1 41684 1727204461.87384: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41684 1727204461.88308: done with get_vars() 41684 1727204461.88325: done getting variables 41684 1727204461.88370: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11 Tuesday 24 September 2024 15:01:01 -0400 (0:00:00.035) 0:00:18.285 ***** 41684 1727204461.88395: entering _queue_task() for managed-node1/fail 41684 
1727204461.88610: worker is 1 (out of 1 available) 41684 1727204461.88623: exiting _queue_task() for managed-node1/fail 41684 1727204461.88637: done queuing things up, now waiting for results queue to drain 41684 1727204461.88639: waiting for pending results... 41684 1727204461.88817: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 41684 1727204461.88907: in run() - task 0affcd87-79f5-3839-086d-00000000001d 41684 1727204461.88918: variable 'ansible_search_path' from source: unknown 41684 1727204461.88921: variable 'ansible_search_path' from source: unknown 41684 1727204461.88952: calling self._execute() 41684 1727204461.89020: variable 'ansible_host' from source: host vars for 'managed-node1' 41684 1727204461.89024: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41684 1727204461.89033: variable 'omit' from source: magic vars 41684 1727204461.89314: variable 'ansible_distribution_major_version' from source: facts 41684 1727204461.89324: Evaluated conditional (ansible_distribution_major_version != '6'): True 41684 1727204461.89408: variable 'network_state' from source: role '' defaults 41684 1727204461.89416: Evaluated conditional (network_state != {}): False 41684 1727204461.89420: when evaluation is False, skipping this task 41684 1727204461.89423: _execute() done 41684 1727204461.89426: dumping result to json 41684 1727204461.89429: done dumping result, returning 41684 1727204461.89437: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider [0affcd87-79f5-3839-086d-00000000001d] 41684 1727204461.89443: sending task result for task 0affcd87-79f5-3839-086d-00000000001d 41684 1727204461.89529: done sending task result for task 
0affcd87-79f5-3839-086d-00000000001d 41684 1727204461.89532: WORKER PROCESS EXITING skipping: [managed-node1] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 41684 1727204461.89581: no more pending results, returning what we have 41684 1727204461.89586: results queue empty 41684 1727204461.89586: checking for any_errors_fatal 41684 1727204461.89594: done checking for any_errors_fatal 41684 1727204461.89595: checking for max_fail_percentage 41684 1727204461.89597: done checking for max_fail_percentage 41684 1727204461.89598: checking to see if all hosts have failed and the running result is not ok 41684 1727204461.89599: done checking to see if all hosts have failed 41684 1727204461.89599: getting the remaining hosts for this loop 41684 1727204461.89601: done getting the remaining hosts for this loop 41684 1727204461.89605: getting the next task for host managed-node1 41684 1727204461.89611: done getting next task for host managed-node1 41684 1727204461.89615: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 41684 1727204461.89618: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 41684 1727204461.89632: getting variables 41684 1727204461.89634: in VariableManager get_vars() 41684 1727204461.89674: Calling all_inventory to load vars for managed-node1 41684 1727204461.89677: Calling groups_inventory to load vars for managed-node1 41684 1727204461.89679: Calling all_plugins_inventory to load vars for managed-node1 41684 1727204461.89690: Calling all_plugins_play to load vars for managed-node1 41684 1727204461.89692: Calling groups_plugins_inventory to load vars for managed-node1 41684 1727204461.89694: Calling groups_plugins_play to load vars for managed-node1 41684 1727204461.90597: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41684 1727204461.91515: done with get_vars() 41684 1727204461.91532: done getting variables 41684 1727204461.91583: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18 Tuesday 24 September 2024 15:01:01 -0400 (0:00:00.032) 0:00:18.317 ***** 41684 1727204461.91607: entering _queue_task() for managed-node1/fail 41684 1727204461.91823: worker is 1 (out of 1 available) 41684 1727204461.91837: exiting _queue_task() for managed-node1/fail 41684 1727204461.91851: done queuing things up, now waiting for results queue to drain 41684 1727204461.91852: waiting for pending results... 
41684 1727204461.92027: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 41684 1727204461.92118: in run() - task 0affcd87-79f5-3839-086d-00000000001e 41684 1727204461.92129: variable 'ansible_search_path' from source: unknown 41684 1727204461.92132: variable 'ansible_search_path' from source: unknown 41684 1727204461.92166: calling self._execute() 41684 1727204461.92230: variable 'ansible_host' from source: host vars for 'managed-node1' 41684 1727204461.92233: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41684 1727204461.92242: variable 'omit' from source: magic vars 41684 1727204461.92513: variable 'ansible_distribution_major_version' from source: facts 41684 1727204461.92522: Evaluated conditional (ansible_distribution_major_version != '6'): True 41684 1727204461.92606: variable 'network_state' from source: role '' defaults 41684 1727204461.92613: Evaluated conditional (network_state != {}): False 41684 1727204461.92616: when evaluation is False, skipping this task 41684 1727204461.92619: _execute() done 41684 1727204461.92623: dumping result to json 41684 1727204461.92626: done dumping result, returning 41684 1727204461.92630: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 [0affcd87-79f5-3839-086d-00000000001e] 41684 1727204461.92637: sending task result for task 0affcd87-79f5-3839-086d-00000000001e 41684 1727204461.92721: done sending task result for task 0affcd87-79f5-3839-086d-00000000001e 41684 1727204461.92724: WORKER PROCESS EXITING skipping: [managed-node1] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 41684 1727204461.92792: no more pending results, returning what we have 41684 
1727204461.92796: results queue empty 41684 1727204461.92796: checking for any_errors_fatal 41684 1727204461.92802: done checking for any_errors_fatal 41684 1727204461.92803: checking for max_fail_percentage 41684 1727204461.92804: done checking for max_fail_percentage 41684 1727204461.92805: checking to see if all hosts have failed and the running result is not ok 41684 1727204461.92806: done checking to see if all hosts have failed 41684 1727204461.92807: getting the remaining hosts for this loop 41684 1727204461.92808: done getting the remaining hosts for this loop 41684 1727204461.92812: getting the next task for host managed-node1 41684 1727204461.92817: done getting next task for host managed-node1 41684 1727204461.92821: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 41684 1727204461.92824: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 41684 1727204461.92837: getting variables 41684 1727204461.92839: in VariableManager get_vars() 41684 1727204461.92884: Calling all_inventory to load vars for managed-node1 41684 1727204461.92887: Calling groups_inventory to load vars for managed-node1 41684 1727204461.92888: Calling all_plugins_inventory to load vars for managed-node1 41684 1727204461.92895: Calling all_plugins_play to load vars for managed-node1 41684 1727204461.92896: Calling groups_plugins_inventory to load vars for managed-node1 41684 1727204461.92898: Calling groups_plugins_play to load vars for managed-node1 41684 1727204461.93707: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41684 1727204461.95078: done with get_vars() 41684 1727204461.95113: done getting variables 41684 1727204461.95184: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25 Tuesday 24 September 2024 15:01:01 -0400 (0:00:00.036) 0:00:18.353 ***** 41684 1727204461.95221: entering _queue_task() for managed-node1/fail 41684 1727204461.95533: worker is 1 (out of 1 available) 41684 1727204461.95546: exiting _queue_task() for managed-node1/fail 41684 1727204461.95565: done queuing things up, now waiting for results queue to drain 41684 1727204461.95568: waiting for pending results... 
41684 1727204461.95748: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 41684 1727204461.95840: in run() - task 0affcd87-79f5-3839-086d-00000000001f 41684 1727204461.95851: variable 'ansible_search_path' from source: unknown 41684 1727204461.95855: variable 'ansible_search_path' from source: unknown 41684 1727204461.95888: calling self._execute() 41684 1727204461.95957: variable 'ansible_host' from source: host vars for 'managed-node1' 41684 1727204461.95961: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41684 1727204461.95977: variable 'omit' from source: magic vars 41684 1727204461.96256: variable 'ansible_distribution_major_version' from source: facts 41684 1727204461.96270: Evaluated conditional (ansible_distribution_major_version != '6'): True 41684 1727204461.96394: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 41684 1727204461.98660: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 41684 1727204461.98732: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 41684 1727204461.98779: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 41684 1727204461.98829: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 41684 1727204461.98859: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 41684 1727204461.98943: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 41684 1727204461.98985: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 41684 1727204461.99015: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 41684 1727204461.99060: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 41684 1727204461.99084: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 41684 1727204461.99182: variable 'ansible_distribution_major_version' from source: facts 41684 1727204461.99202: Evaluated conditional (ansible_distribution_major_version | int > 9): False 41684 1727204461.99208: when evaluation is False, skipping this task 41684 1727204461.99215: _execute() done 41684 1727204461.99221: dumping result to json 41684 1727204461.99227: done dumping result, returning 41684 1727204461.99238: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later [0affcd87-79f5-3839-086d-00000000001f] 41684 1727204461.99249: sending task result for task 0affcd87-79f5-3839-086d-00000000001f skipping: [managed-node1] => { "changed": false, "false_condition": "ansible_distribution_major_version | int > 9", "skip_reason": "Conditional result was False" } 41684 1727204461.99400: no more pending results, returning what we have 41684 1727204461.99403: results queue empty 41684 1727204461.99404: checking for any_errors_fatal 41684 1727204461.99410: done checking for any_errors_fatal 41684 
1727204461.99411: checking for max_fail_percentage 41684 1727204461.99413: done checking for max_fail_percentage 41684 1727204461.99413: checking to see if all hosts have failed and the running result is not ok 41684 1727204461.99414: done checking to see if all hosts have failed 41684 1727204461.99415: getting the remaining hosts for this loop 41684 1727204461.99417: done getting the remaining hosts for this loop 41684 1727204461.99421: getting the next task for host managed-node1 41684 1727204461.99427: done getting next task for host managed-node1 41684 1727204461.99431: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 41684 1727204461.99434: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 41684 1727204461.99451: getting variables 41684 1727204461.99453: in VariableManager get_vars() 41684 1727204461.99497: Calling all_inventory to load vars for managed-node1 41684 1727204461.99500: Calling groups_inventory to load vars for managed-node1 41684 1727204461.99502: Calling all_plugins_inventory to load vars for managed-node1 41684 1727204461.99513: Calling all_plugins_play to load vars for managed-node1 41684 1727204461.99515: Calling groups_plugins_inventory to load vars for managed-node1 41684 1727204461.99518: Calling groups_plugins_play to load vars for managed-node1 41684 1727204462.00036: done sending task result for task 0affcd87-79f5-3839-086d-00000000001f 41684 1727204462.00041: WORKER PROCESS EXITING 41684 1727204462.01050: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41684 1727204462.02774: done with get_vars() 41684 1727204462.02804: done getting variables 41684 1727204462.02910: Loading ActionModule 'dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36 Tuesday 24 September 2024 15:01:02 -0400 (0:00:00.077) 0:00:18.430 ***** 41684 1727204462.02941: entering _queue_task() for managed-node1/dnf 41684 1727204462.03266: worker is 1 (out of 1 available) 41684 1727204462.03278: exiting _queue_task() for managed-node1/dnf 41684 1727204462.03292: done queuing things up, now waiting for results queue to drain 41684 1727204462.03293: waiting for pending results... 
41684 1727204462.03580: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 41684 1727204462.03711: in run() - task 0affcd87-79f5-3839-086d-000000000020 41684 1727204462.03735: variable 'ansible_search_path' from source: unknown 41684 1727204462.03743: variable 'ansible_search_path' from source: unknown 41684 1727204462.03787: calling self._execute() 41684 1727204462.03875: variable 'ansible_host' from source: host vars for 'managed-node1' 41684 1727204462.03886: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41684 1727204462.03898: variable 'omit' from source: magic vars 41684 1727204462.04276: variable 'ansible_distribution_major_version' from source: facts 41684 1727204462.04295: Evaluated conditional (ansible_distribution_major_version != '6'): True 41684 1727204462.04509: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 41684 1727204462.06914: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 41684 1727204462.07006: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 41684 1727204462.07049: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 41684 1727204462.07093: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 41684 1727204462.07128: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 41684 1727204462.07217: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 41684 1727204462.07251: Loading 
FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 41684 1727204462.07290: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 41684 1727204462.07341: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 41684 1727204462.07367: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 41684 1727204462.07496: variable 'ansible_distribution' from source: facts 41684 1727204462.07507: variable 'ansible_distribution_major_version' from source: facts 41684 1727204462.07528: Evaluated conditional (ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7): True 41684 1727204462.07666: variable '__network_wireless_connections_defined' from source: role '' defaults 41684 1727204462.07806: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 41684 1727204462.07836: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 41684 1727204462.07875: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 41684 1727204462.07919: Loading 
FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 41684 1727204462.07934: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 41684 1727204462.07982: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 41684 1727204462.08008: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 41684 1727204462.08032: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 41684 1727204462.08078: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 41684 1727204462.08099: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 41684 1727204462.08138: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 41684 1727204462.08166: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 41684 
1727204462.08197: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 41684 1727204462.08238: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 41684 1727204462.08252: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 41684 1727204462.08424: variable 'network_connections' from source: task vars 41684 1727204462.08439: variable 'interface0' from source: play vars 41684 1727204462.08517: variable 'interface0' from source: play vars 41684 1727204462.08532: variable 'interface0' from source: play vars 41684 1727204462.08600: variable 'interface0' from source: play vars 41684 1727204462.08616: variable 'interface1' from source: play vars 41684 1727204462.08686: variable 'interface1' from source: play vars 41684 1727204462.08697: variable 'interface1' from source: play vars 41684 1727204462.08755: variable 'interface1' from source: play vars 41684 1727204462.08833: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 41684 1727204462.09017: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 41684 1727204462.09060: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 41684 1727204462.09101: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 41684 1727204462.09134: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 41684 1727204462.09199: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 41684 1727204462.09241: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 41684 1727204462.09279: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 41684 1727204462.09313: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 41684 1727204462.09379: variable '__network_team_connections_defined' from source: role '' defaults 41684 1727204462.09638: variable 'network_connections' from source: task vars 41684 1727204462.09650: variable 'interface0' from source: play vars 41684 1727204462.09720: variable 'interface0' from source: play vars 41684 1727204462.09733: variable 'interface0' from source: play vars 41684 1727204462.09797: variable 'interface0' from source: play vars 41684 1727204462.09813: variable 'interface1' from source: play vars 41684 1727204462.09885: variable 'interface1' from source: play vars 41684 1727204462.09897: variable 'interface1' from source: play vars 41684 1727204462.09965: variable 'interface1' from source: play vars 41684 1727204462.10009: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 41684 1727204462.10016: when evaluation is False, skipping this task 41684 1727204462.10023: _execute() done 41684 1727204462.10029: dumping result to json 41684 1727204462.10036: done dumping result, returning 41684 1727204462.10052: done running TaskExecutor() for managed-node1/TASK: 
fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces [0affcd87-79f5-3839-086d-000000000020] 41684 1727204462.10066: sending task result for task 0affcd87-79f5-3839-086d-000000000020 skipping: [managed-node1] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 41684 1727204462.10220: no more pending results, returning what we have 41684 1727204462.10225: results queue empty 41684 1727204462.10226: checking for any_errors_fatal 41684 1727204462.10233: done checking for any_errors_fatal 41684 1727204462.10233: checking for max_fail_percentage 41684 1727204462.10235: done checking for max_fail_percentage 41684 1727204462.10236: checking to see if all hosts have failed and the running result is not ok 41684 1727204462.10237: done checking to see if all hosts have failed 41684 1727204462.10238: getting the remaining hosts for this loop 41684 1727204462.10240: done getting the remaining hosts for this loop 41684 1727204462.10244: getting the next task for host managed-node1 41684 1727204462.10251: done getting next task for host managed-node1 41684 1727204462.10256: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 41684 1727204462.10259: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False 41684 1727204462.10277: getting variables 41684 1727204462.10279: in VariableManager get_vars() 41684 1727204462.10324: Calling all_inventory to load vars for managed-node1 41684 1727204462.10327: Calling groups_inventory to load vars for managed-node1 41684 1727204462.10330: Calling all_plugins_inventory to load vars for managed-node1 41684 1727204462.10340: Calling all_plugins_play to load vars for managed-node1 41684 1727204462.10343: Calling groups_plugins_inventory to load vars for managed-node1 41684 1727204462.10346: Calling groups_plugins_play to load vars for managed-node1 41684 1727204462.11383: done sending task result for task 0affcd87-79f5-3839-086d-000000000020 41684 1727204462.11387: WORKER PROCESS EXITING 41684 1727204462.12092: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41684 1727204462.13978: done with get_vars() 41684 1727204462.14002: done getting variables redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 41684 1727204462.14086: Loading ActionModule 'ansible_collections.ansible.builtin.plugins.action.dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48 Tuesday 24 September 2024 15:01:02 -0400 (0:00:00.111) 0:00:18.542 ***** 41684 1727204462.14119: entering _queue_task() for managed-node1/yum 41684 1727204462.14121: Creating lock for yum 41684 1727204462.14452: worker is 1 (out of 1 available) 
41684 1727204462.14468: exiting _queue_task() for managed-node1/yum 41684 1727204462.14482: done queuing things up, now waiting for results queue to drain 41684 1727204462.14483: waiting for pending results... 41684 1727204462.14772: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 41684 1727204462.14910: in run() - task 0affcd87-79f5-3839-086d-000000000021 41684 1727204462.14935: variable 'ansible_search_path' from source: unknown 41684 1727204462.14943: variable 'ansible_search_path' from source: unknown 41684 1727204462.14989: calling self._execute() 41684 1727204462.15079: variable 'ansible_host' from source: host vars for 'managed-node1' 41684 1727204462.15091: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41684 1727204462.15104: variable 'omit' from source: magic vars 41684 1727204462.15490: variable 'ansible_distribution_major_version' from source: facts 41684 1727204462.15508: Evaluated conditional (ansible_distribution_major_version != '6'): True 41684 1727204462.15702: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 41684 1727204462.22303: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 41684 1727204462.22375: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 41684 1727204462.22422: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 41684 1727204462.22459: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 41684 1727204462.22504: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 41684 1727204462.22582: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 41684 1727204462.22616: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 41684 1727204462.22650: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 41684 1727204462.22700: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 41684 1727204462.22719: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 41684 1727204462.22822: variable 'ansible_distribution_major_version' from source: facts 41684 1727204462.22847: Evaluated conditional (ansible_distribution_major_version | int < 8): False 41684 1727204462.22855: when evaluation is False, skipping this task 41684 1727204462.22866: _execute() done 41684 1727204462.22872: dumping result to json 41684 1727204462.22877: done dumping result, returning 41684 1727204462.22886: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces [0affcd87-79f5-3839-086d-000000000021] 41684 1727204462.22893: sending task result for task 0affcd87-79f5-3839-086d-000000000021 skipping: [managed-node1] => { "changed": false, "false_condition": "ansible_distribution_major_version | int < 8", "skip_reason": "Conditional result was 
False" } 41684 1727204462.23022: no more pending results, returning what we have 41684 1727204462.23025: results queue empty 41684 1727204462.23026: checking for any_errors_fatal 41684 1727204462.23031: done checking for any_errors_fatal 41684 1727204462.23032: checking for max_fail_percentage 41684 1727204462.23033: done checking for max_fail_percentage 41684 1727204462.23034: checking to see if all hosts have failed and the running result is not ok 41684 1727204462.23035: done checking to see if all hosts have failed 41684 1727204462.23035: getting the remaining hosts for this loop 41684 1727204462.23037: done getting the remaining hosts for this loop 41684 1727204462.23040: getting the next task for host managed-node1 41684 1727204462.23046: done getting next task for host managed-node1 41684 1727204462.23050: ^ task is: TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 41684 1727204462.23052: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 41684 1727204462.23069: getting variables 41684 1727204462.23071: in VariableManager get_vars() 41684 1727204462.23111: Calling all_inventory to load vars for managed-node1 41684 1727204462.23115: Calling groups_inventory to load vars for managed-node1 41684 1727204462.23117: Calling all_plugins_inventory to load vars for managed-node1 41684 1727204462.23127: Calling all_plugins_play to load vars for managed-node1 41684 1727204462.23130: Calling groups_plugins_inventory to load vars for managed-node1 41684 1727204462.23133: Calling groups_plugins_play to load vars for managed-node1 41684 1727204462.27388: done sending task result for task 0affcd87-79f5-3839-086d-000000000021 41684 1727204462.27392: WORKER PROCESS EXITING 41684 1727204462.28317: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41684 1727204462.29776: done with get_vars() 41684 1727204462.29798: done getting variables 41684 1727204462.29836: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60 Tuesday 24 September 2024 15:01:02 -0400 (0:00:00.157) 0:00:18.700 ***** 41684 1727204462.29857: entering _queue_task() for managed-node1/fail 41684 1727204462.30089: worker is 1 (out of 1 available) 41684 1727204462.30102: exiting _queue_task() for managed-node1/fail 41684 1727204462.30116: done queuing things up, now waiting for results queue to drain 41684 1727204462.30118: waiting for pending results... 
41684 1727204462.30293: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 41684 1727204462.30394: in run() - task 0affcd87-79f5-3839-086d-000000000022 41684 1727204462.30407: variable 'ansible_search_path' from source: unknown 41684 1727204462.30410: variable 'ansible_search_path' from source: unknown 41684 1727204462.30440: calling self._execute() 41684 1727204462.30511: variable 'ansible_host' from source: host vars for 'managed-node1' 41684 1727204462.30515: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41684 1727204462.30524: variable 'omit' from source: magic vars 41684 1727204462.30801: variable 'ansible_distribution_major_version' from source: facts 41684 1727204462.30813: Evaluated conditional (ansible_distribution_major_version != '6'): True 41684 1727204462.30899: variable '__network_wireless_connections_defined' from source: role '' defaults 41684 1727204462.31036: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 41684 1727204462.33627: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 41684 1727204462.33715: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 41684 1727204462.33762: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 41684 1727204462.33803: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 41684 1727204462.33835: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 41684 1727204462.33922: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 
(found_in_cache=True, class_only=False) 41684 1727204462.33956: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 41684 1727204462.33995: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 41684 1727204462.34042: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 41684 1727204462.34060: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 41684 1727204462.34116: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 41684 1727204462.34143: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 41684 1727204462.34173: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 41684 1727204462.34223: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 41684 1727204462.34240: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 41684 1727204462.34293: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 41684 1727204462.34320: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 41684 1727204462.34386: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 41684 1727204462.34589: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 41684 1727204462.34947: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 41684 1727204462.35147: variable 'network_connections' from source: task vars 41684 1727204462.35166: variable 'interface0' from source: play vars 41684 1727204462.35256: variable 'interface0' from source: play vars 41684 1727204462.35274: variable 'interface0' from source: play vars 41684 1727204462.35345: variable 'interface0' from source: play vars 41684 1727204462.35367: variable 'interface1' from source: play vars 41684 1727204462.35438: variable 'interface1' from source: play vars 41684 1727204462.35450: variable 'interface1' from source: play vars 41684 1727204462.35529: variable 'interface1' from source: play vars 41684 1727204462.35610: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 
41684 1727204462.36013: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 41684 1727204462.36056: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 41684 1727204462.36097: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 41684 1727204462.36128: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 41684 1727204462.36185: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 41684 1727204462.36214: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 41684 1727204462.36244: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 41684 1727204462.36276: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 41684 1727204462.36347: variable '__network_team_connections_defined' from source: role '' defaults 41684 1727204462.36627: variable 'network_connections' from source: task vars 41684 1727204462.36637: variable 'interface0' from source: play vars 41684 1727204462.36702: variable 'interface0' from source: play vars 41684 1727204462.36714: variable 'interface0' from source: play vars 41684 1727204462.36783: variable 'interface0' from source: play vars 41684 1727204462.36800: variable 'interface1' from source: play vars 41684 1727204462.36989: variable 'interface1' from source: play vars 41684 1727204462.37028: variable 
'interface1' from source: play vars 41684 1727204462.37190: variable 'interface1' from source: play vars 41684 1727204462.37317: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 41684 1727204462.37320: when evaluation is False, skipping this task 41684 1727204462.37323: _execute() done 41684 1727204462.37326: dumping result to json 41684 1727204462.37328: done dumping result, returning 41684 1727204462.37334: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces [0affcd87-79f5-3839-086d-000000000022] 41684 1727204462.37358: sending task result for task 0affcd87-79f5-3839-086d-000000000022 41684 1727204462.37461: done sending task result for task 0affcd87-79f5-3839-086d-000000000022 41684 1727204462.37476: WORKER PROCESS EXITING skipping: [managed-node1] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 41684 1727204462.37547: no more pending results, returning what we have 41684 1727204462.37552: results queue empty 41684 1727204462.37553: checking for any_errors_fatal 41684 1727204462.37560: done checking for any_errors_fatal 41684 1727204462.37561: checking for max_fail_percentage 41684 1727204462.37569: done checking for max_fail_percentage 41684 1727204462.37570: checking to see if all hosts have failed and the running result is not ok 41684 1727204462.37571: done checking to see if all hosts have failed 41684 1727204462.37572: getting the remaining hosts for this loop 41684 1727204462.37573: done getting the remaining hosts for this loop 41684 1727204462.37577: getting the next task for host managed-node1 41684 1727204462.37583: done getting next task for host managed-node1 41684 1727204462.37588: ^ task is: TASK: fedora.linux_system_roles.network : Install packages 41684 
1727204462.37590: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 41684 1727204462.37604: getting variables 41684 1727204462.37606: in VariableManager get_vars() 41684 1727204462.37646: Calling all_inventory to load vars for managed-node1 41684 1727204462.37649: Calling groups_inventory to load vars for managed-node1 41684 1727204462.37650: Calling all_plugins_inventory to load vars for managed-node1 41684 1727204462.37659: Calling all_plugins_play to load vars for managed-node1 41684 1727204462.37665: Calling groups_plugins_inventory to load vars for managed-node1 41684 1727204462.37669: Calling groups_plugins_play to load vars for managed-node1 41684 1727204462.38983: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41684 1727204462.41743: done with get_vars() 41684 1727204462.41785: done getting variables 41684 1727204462.41850: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install packages] ******************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73 Tuesday 24 September 2024 15:01:02 -0400 (0:00:00.120) 0:00:18.820 
***** 41684 1727204462.41887: entering _queue_task() for managed-node1/package 41684 1727204462.42215: worker is 1 (out of 1 available) 41684 1727204462.42228: exiting _queue_task() for managed-node1/package 41684 1727204462.42242: done queuing things up, now waiting for results queue to drain 41684 1727204462.42243: waiting for pending results... 41684 1727204462.42529: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Install packages 41684 1727204462.42684: in run() - task 0affcd87-79f5-3839-086d-000000000023 41684 1727204462.42707: variable 'ansible_search_path' from source: unknown 41684 1727204462.42718: variable 'ansible_search_path' from source: unknown 41684 1727204462.42756: calling self._execute() 41684 1727204462.42852: variable 'ansible_host' from source: host vars for 'managed-node1' 41684 1727204462.42865: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41684 1727204462.42879: variable 'omit' from source: magic vars 41684 1727204462.43484: variable 'ansible_distribution_major_version' from source: facts 41684 1727204462.43501: Evaluated conditional (ansible_distribution_major_version != '6'): True 41684 1727204462.43709: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 41684 1727204462.44013: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 41684 1727204462.44062: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 41684 1727204462.44150: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 41684 1727204462.44192: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 41684 1727204462.44320: variable 'network_packages' from source: role '' defaults 41684 1727204462.44442: variable '__network_provider_setup' from source: role '' defaults 
41684 1727204462.44460: variable '__network_service_name_default_nm' from source: role '' defaults 41684 1727204462.44539: variable '__network_service_name_default_nm' from source: role '' defaults 41684 1727204462.44557: variable '__network_packages_default_nm' from source: role '' defaults 41684 1727204462.44625: variable '__network_packages_default_nm' from source: role '' defaults 41684 1727204462.44848: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 41684 1727204462.47342: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 41684 1727204462.47424: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 41684 1727204462.47466: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 41684 1727204462.47511: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 41684 1727204462.47541: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 41684 1727204462.47651: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 41684 1727204462.47689: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 41684 1727204462.47726: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 41684 1727204462.47773: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 41684 1727204462.47793: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 41684 1727204462.47851: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 41684 1727204462.47885: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 41684 1727204462.47924: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 41684 1727204462.47975: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 41684 1727204462.47995: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 41684 1727204462.48268: variable '__network_packages_default_gobject_packages' from source: role '' defaults 41684 1727204462.48404: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 41684 1727204462.48434: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 
(found_in_cache=True, class_only=False) 41684 1727204462.48478: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 41684 1727204462.48527: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 41684 1727204462.48547: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 41684 1727204462.48657: variable 'ansible_python' from source: facts 41684 1727204462.48700: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 41684 1727204462.48803: variable '__network_wpa_supplicant_required' from source: role '' defaults 41684 1727204462.48905: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 41684 1727204462.49035: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 41684 1727204462.49061: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 41684 1727204462.49092: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 41684 1727204462.49140: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 41684 
1727204462.49158: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 41684 1727204462.49207: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 41684 1727204462.49249: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 41684 1727204462.49278: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 41684 1727204462.49319: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 41684 1727204462.49343: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 41684 1727204462.49488: variable 'network_connections' from source: task vars 41684 1727204462.49500: variable 'interface0' from source: play vars 41684 1727204462.49602: variable 'interface0' from source: play vars 41684 1727204462.49616: variable 'interface0' from source: play vars 41684 1727204462.49718: variable 'interface0' from source: play vars 41684 1727204462.49737: variable 'interface1' from source: play vars 41684 1727204462.49837: variable 'interface1' from source: play vars 41684 1727204462.49853: variable 'interface1' from source: play vars 41684 1727204462.49958: variable 'interface1' from source: play vars 41684 1727204462.50042: 
Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False)
41684 1727204462.50075: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False)
41684 1727204462.50117: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False)
41684 1727204462.50150: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False)
41684 1727204462.50206: variable '__network_wireless_connections_defined' from source: role '' defaults
41684 1727204462.50478: variable 'network_connections' from source: task vars
41684 1727204462.50487: variable 'interface0' from source: play vars
41684 1727204462.50589: variable 'interface0' from source: play vars
41684 1727204462.50602: variable 'interface0' from source: play vars
41684 1727204462.50703: variable 'interface0' from source: play vars
41684 1727204462.50719: variable 'interface1' from source: play vars
41684 1727204462.50821: variable 'interface1' from source: play vars
41684 1727204462.50835: variable 'interface1' from source: play vars
41684 1727204462.50942: variable 'interface1' from source: play vars
41684 1727204462.51013: variable '__network_packages_default_wireless' from source: role '' defaults
41684 1727204462.51103: variable '__network_wireless_connections_defined' from source: role '' defaults
41684 1727204462.51443: variable 'network_connections' from source: task vars
41684 1727204462.51453: variable 'interface0' from source: play vars
41684 1727204462.51533: variable 'interface0' from source: play vars
41684 1727204462.51544: variable 'interface0' from source: play vars
41684 1727204462.51623: variable 'interface0' from source: play vars
41684 1727204462.51647: variable 'interface1' from source: play vars
41684 1727204462.51718: variable 'interface1' from source: play vars
41684 1727204462.51729: variable 'interface1' from source: play vars
41684 1727204462.51795: variable 'interface1' from source: play vars
41684 1727204462.51833: variable '__network_packages_default_team' from source: role '' defaults
41684 1727204462.51929: variable '__network_team_connections_defined' from source: role '' defaults
41684 1727204462.52713: variable 'network_connections' from source: task vars
41684 1727204462.52727: variable 'interface0' from source: play vars
41684 1727204462.52796: variable 'interface0' from source: play vars
41684 1727204462.52923: variable 'interface0' from source: play vars
41684 1727204462.53000: variable 'interface0' from source: play vars
41684 1727204462.53060: variable 'interface1' from source: play vars
41684 1727204462.53128: variable 'interface1' from source: play vars
41684 1727204462.53799: variable 'interface1' from source: play vars
41684 1727204462.53868: variable 'interface1' from source: play vars
41684 1727204462.54061: variable '__network_service_name_default_initscripts' from source: role '' defaults
41684 1727204462.54190: variable '__network_service_name_default_initscripts' from source: role '' defaults
41684 1727204462.54919: variable '__network_packages_default_initscripts' from source: role '' defaults
41684 1727204462.54990: variable '__network_packages_default_initscripts' from source: role '' defaults
41684 1727204462.55493: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults
41684 1727204462.56273: variable 'network_connections' from source: task vars
41684 1727204462.56279: variable 'interface0' from source: play vars
41684 1727204462.56344: variable 'interface0' from source: play vars
41684 1727204462.56351: variable 'interface0' from source: play vars
41684 1727204462.56420: variable 'interface0' from source: play vars
41684 1727204462.56431: variable 'interface1' from source: play vars
41684 1727204462.56495: variable 'interface1' from source: play vars
41684 1727204462.56501: variable 'interface1' from source: play vars
41684 1727204462.56563: variable 'interface1' from source: play vars
41684 1727204462.56580: variable 'ansible_distribution' from source: facts
41684 1727204462.56583: variable '__network_rh_distros' from source: role '' defaults
41684 1727204462.56589: variable 'ansible_distribution_major_version' from source: facts
41684 1727204462.56619: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults
41684 1727204462.56794: variable 'ansible_distribution' from source: facts
41684 1727204462.56798: variable '__network_rh_distros' from source: role '' defaults
41684 1727204462.56803: variable 'ansible_distribution_major_version' from source: facts
41684 1727204462.56816: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults
41684 1727204462.56993: variable 'ansible_distribution' from source: facts
41684 1727204462.56997: variable '__network_rh_distros' from source: role '' defaults
41684 1727204462.57001: variable 'ansible_distribution_major_version' from source: facts
41684 1727204462.57038: variable 'network_provider' from source: set_fact
41684 1727204462.57057: variable 'ansible_facts' from source: unknown
41684 1727204462.57838: Evaluated conditional (not network_packages is subset(ansible_facts.packages.keys())): False
41684 1727204462.57842: when evaluation is False, skipping this task
41684 1727204462.57844: _execute() done
41684 1727204462.57847: dumping result to json
41684 1727204462.57849: done dumping result, returning
41684 1727204462.57858: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Install packages [0affcd87-79f5-3839-086d-000000000023]
41684 1727204462.57863: sending task result for task 0affcd87-79f5-3839-086d-000000000023
41684 1727204462.57962: done sending task result for task 0affcd87-79f5-3839-086d-000000000023
41684 1727204462.57967: WORKER PROCESS EXITING
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "not network_packages is subset(ansible_facts.packages.keys())",
    "skip_reason": "Conditional result was False"
}
41684 1727204462.58013: no more pending results, returning what we have
41684 1727204462.58017: results queue empty
41684 1727204462.58023: checking for any_errors_fatal
41684 1727204462.58031: done checking for any_errors_fatal
41684 1727204462.58031: checking for max_fail_percentage
41684 1727204462.58033: done checking for max_fail_percentage
41684 1727204462.58034: checking to see if all hosts have failed and the running result is not ok
41684 1727204462.58035: done checking to see if all hosts have failed
41684 1727204462.58035: getting the remaining hosts for this loop
41684 1727204462.58037: done getting the remaining hosts for this loop
41684 1727204462.58042: getting the next task for host managed-node1
41684 1727204462.58048: done getting next task for host managed-node1
41684 1727204462.58053: ^ task is: TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable
41684 1727204462.58055: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
41684 1727204462.58070: getting variables
41684 1727204462.58072: in VariableManager get_vars()
41684 1727204462.58114: Calling all_inventory to load vars for managed-node1
41684 1727204462.58117: Calling groups_inventory to load vars for managed-node1
41684 1727204462.58119: Calling all_plugins_inventory to load vars for managed-node1
41684 1727204462.58128: Calling all_plugins_play to load vars for managed-node1
41684 1727204462.58130: Calling groups_plugins_inventory to load vars for managed-node1
41684 1727204462.58132: Calling groups_plugins_play to load vars for managed-node1
41684 1727204462.59485: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
41684 1727204462.61208: done with get_vars()
41684 1727204462.61237: done getting variables
41684 1727204462.61306: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable] ***
task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85
Tuesday 24 September 2024 15:01:02 -0400 (0:00:00.194) 0:00:19.014 *****
41684 1727204462.61341: entering _queue_task() for managed-node1/package
41684 1727204462.61670: worker is 1 (out of 1 available)
41684 1727204462.61682: exiting _queue_task() for managed-node1/package
41684 1727204462.61696: done queuing things up, now waiting for results queue to drain
41684 1727204462.61698: waiting for pending results...
41684 1727204462.61984: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable
41684 1727204462.62311: in run() - task 0affcd87-79f5-3839-086d-000000000024
41684 1727204462.62331: variable 'ansible_search_path' from source: unknown
41684 1727204462.62339: variable 'ansible_search_path' from source: unknown
41684 1727204462.62383: calling self._execute()
41684 1727204462.62477: variable 'ansible_host' from source: host vars for 'managed-node1'
41684 1727204462.62487: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1'
41684 1727204462.62500: variable 'omit' from source: magic vars
41684 1727204462.62929: variable 'ansible_distribution_major_version' from source: facts
41684 1727204462.62949: Evaluated conditional (ansible_distribution_major_version != '6'): True
41684 1727204462.63081: variable 'network_state' from source: role '' defaults
41684 1727204462.63096: Evaluated conditional (network_state != {}): False
41684 1727204462.63103: when evaluation is False, skipping this task
41684 1727204462.63109: _execute() done
41684 1727204462.63115: dumping result to json
41684 1727204462.63121: done dumping result, returning
41684 1727204462.63132: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable [0affcd87-79f5-3839-086d-000000000024]
41684 1727204462.63146: sending task result for task 0affcd87-79f5-3839-086d-000000000024
41684 1727204462.63272: done sending task result for task 0affcd87-79f5-3839-086d-000000000024
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "network_state != {}",
    "skip_reason": "Conditional result was False"
}
41684 1727204462.63323: no more pending results, returning what we have
41684 1727204462.63327: results queue empty
41684 1727204462.63328: checking for any_errors_fatal
41684 1727204462.63335: done checking for any_errors_fatal
41684 1727204462.63335: checking for max_fail_percentage
41684 1727204462.63337: done checking for max_fail_percentage
41684 1727204462.63339: checking to see if all hosts have failed and the running result is not ok
41684 1727204462.63339: done checking to see if all hosts have failed
41684 1727204462.63340: getting the remaining hosts for this loop
41684 1727204462.63342: done getting the remaining hosts for this loop
41684 1727204462.63346: getting the next task for host managed-node1
41684 1727204462.63353: done getting next task for host managed-node1
41684 1727204462.63357: ^ task is: TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable
41684 1727204462.63361: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
41684 1727204462.63378: getting variables
41684 1727204462.63380: in VariableManager get_vars()
41684 1727204462.63425: Calling all_inventory to load vars for managed-node1
41684 1727204462.63428: Calling groups_inventory to load vars for managed-node1
41684 1727204462.63430: Calling all_plugins_inventory to load vars for managed-node1
41684 1727204462.63443: Calling all_plugins_play to load vars for managed-node1
41684 1727204462.63446: Calling groups_plugins_inventory to load vars for managed-node1
41684 1727204462.63448: Calling groups_plugins_play to load vars for managed-node1
41684 1727204462.64527: WORKER PROCESS EXITING
41684 1727204462.65418: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
41684 1727204462.67591: done with get_vars()
41684 1727204462.67622: done getting variables
41684 1727204462.67693: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable] ***
task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96
Tuesday 24 September 2024 15:01:02 -0400 (0:00:00.063) 0:00:19.078 *****
41684 1727204462.67726: entering _queue_task() for managed-node1/package
41684 1727204462.68037: worker is 1 (out of 1 available)
41684 1727204462.68050: exiting _queue_task() for managed-node1/package
41684 1727204462.68065: done queuing things up, now waiting for results queue to drain
41684 1727204462.68068: waiting for pending results...
41684 1727204462.68353: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable
41684 1727204462.68499: in run() - task 0affcd87-79f5-3839-086d-000000000025
41684 1727204462.68522: variable 'ansible_search_path' from source: unknown
41684 1727204462.68530: variable 'ansible_search_path' from source: unknown
41684 1727204462.68570: calling self._execute()
41684 1727204462.68667: variable 'ansible_host' from source: host vars for 'managed-node1'
41684 1727204462.68735: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1'
41684 1727204462.68748: variable 'omit' from source: magic vars
41684 1727204462.69544: variable 'ansible_distribution_major_version' from source: facts
41684 1727204462.69626: Evaluated conditional (ansible_distribution_major_version != '6'): True
41684 1727204462.69855: variable 'network_state' from source: role '' defaults
41684 1727204462.69949: Evaluated conditional (network_state != {}): False
41684 1727204462.69958: when evaluation is False, skipping this task
41684 1727204462.69967: _execute() done
41684 1727204462.69974: dumping result to json
41684 1727204462.69981: done dumping result, returning
41684 1727204462.69991: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable [0affcd87-79f5-3839-086d-000000000025]
41684 1727204462.70002: sending task result for task 0affcd87-79f5-3839-086d-000000000025
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "network_state != {}",
    "skip_reason": "Conditional result was False"
}
41684 1727204462.70155: no more pending results, returning what we have
41684 1727204462.70159: results queue empty
41684 1727204462.70160: checking for any_errors_fatal
41684 1727204462.70171: done checking for any_errors_fatal
41684 1727204462.70172: checking for max_fail_percentage
41684 1727204462.70174: done checking for max_fail_percentage
41684 1727204462.70175: checking to see if all hosts have failed and the running result is not ok
41684 1727204462.70176: done checking to see if all hosts have failed
41684 1727204462.70177: getting the remaining hosts for this loop
41684 1727204462.70179: done getting the remaining hosts for this loop
41684 1727204462.70183: getting the next task for host managed-node1
41684 1727204462.70191: done getting next task for host managed-node1
41684 1727204462.70195: ^ task is: TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces
41684 1727204462.70198: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
41684 1727204462.70214: getting variables
41684 1727204462.70216: in VariableManager get_vars()
41684 1727204462.70260: Calling all_inventory to load vars for managed-node1
41684 1727204462.70265: Calling groups_inventory to load vars for managed-node1
41684 1727204462.70268: Calling all_plugins_inventory to load vars for managed-node1
41684 1727204462.70281: Calling all_plugins_play to load vars for managed-node1
41684 1727204462.70284: Calling groups_plugins_inventory to load vars for managed-node1
41684 1727204462.70287: Calling groups_plugins_play to load vars for managed-node1
41684 1727204462.71598: done sending task result for task 0affcd87-79f5-3839-086d-000000000025
41684 1727204462.71602: WORKER PROCESS EXITING
41684 1727204462.74134: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
41684 1727204462.77451: done with get_vars()
41684 1727204462.77486: done getting variables
41684 1727204462.77590: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True)

TASK [fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces] ***
task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109
Tuesday 24 September 2024 15:01:02 -0400 (0:00:00.098) 0:00:19.177 *****
41684 1727204462.77624: entering _queue_task() for managed-node1/service
41684 1727204462.77626: Creating lock for service
41684 1727204462.77961: worker is 1 (out of 1 available)
41684 1727204462.78677: exiting _queue_task() for managed-node1/service
41684 1727204462.78690: done queuing things up, now waiting for results queue to drain
41684 1727204462.78691: waiting for pending results...
41684 1727204462.79469: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces
41684 1727204462.79615: in run() - task 0affcd87-79f5-3839-086d-000000000026
41684 1727204462.79638: variable 'ansible_search_path' from source: unknown
41684 1727204462.79648: variable 'ansible_search_path' from source: unknown
41684 1727204462.79700: calling self._execute()
41684 1727204462.79800: variable 'ansible_host' from source: host vars for 'managed-node1'
41684 1727204462.79813: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1'
41684 1727204462.79828: variable 'omit' from source: magic vars
41684 1727204462.80221: variable 'ansible_distribution_major_version' from source: facts
41684 1727204462.80885: Evaluated conditional (ansible_distribution_major_version != '6'): True
41684 1727204462.81024: variable '__network_wireless_connections_defined' from source: role '' defaults
41684 1727204462.81238: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name
41684 1727204462.86597: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py
41684 1727204462.86684: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py
41684 1727204462.86916: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py
41684 1727204462.86953: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py
41684 1727204462.86986: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py
41684 1727204462.87065: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
41684 1727204462.87805: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
41684 1727204462.87840: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
41684 1727204462.87891: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
41684 1727204462.87911: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
41684 1727204462.87967: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
41684 1727204462.87997: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
41684 1727204462.88027: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
41684 1727204462.88078: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
41684 1727204462.88099: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
41684 1727204462.88143: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
41684 1727204462.88177: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
41684 1727204462.88206: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
41684 1727204462.88249: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
41684 1727204462.88274: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
41684 1727204462.88460: variable 'network_connections' from source: task vars
41684 1727204462.88486: variable 'interface0' from source: play vars
41684 1727204462.88573: variable 'interface0' from source: play vars
41684 1727204462.89281: variable 'interface0' from source: play vars
41684 1727204462.89355: variable 'interface0' from source: play vars
41684 1727204462.89378: variable 'interface1' from source: play vars
41684 1727204462.89446: variable 'interface1' from source: play vars
41684 1727204462.89458: variable 'interface1' from source: play vars
41684 1727204462.89525: variable 'interface1' from source: play vars
41684 1727204462.89608: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name
41684 1727204462.89785: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py
41684 1727204462.89838: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py
41684 1727204462.89874: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py
41684 1727204462.89908: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py
41684 1727204462.89954: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False)
41684 1727204462.89984: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False)
41684 1727204462.90010: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False)
41684 1727204462.90701: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False)
41684 1727204462.90784: variable '__network_team_connections_defined' from source: role '' defaults
41684 1727204462.91046: variable 'network_connections' from source: task vars
41684 1727204462.91056: variable 'interface0' from source: play vars
41684 1727204462.91127: variable 'interface0' from source: play vars
41684 1727204462.91138: variable 'interface0' from source: play vars
41684 1727204462.91205: variable 'interface0' from source: play vars
41684 1727204462.91221: variable 'interface1' from source: play vars
41684 1727204462.91289: variable 'interface1' from source: play vars
41684 1727204462.91309: variable 'interface1' from source: play vars
41684 1727204462.91376: variable 'interface1' from source: play vars
41684 1727204462.92110: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False
41684 1727204462.92118: when evaluation is False, skipping this task
41684 1727204462.92125: _execute() done
41684 1727204462.92132: dumping result to json
41684 1727204462.92139: done dumping result, returning
41684 1727204462.92149: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces [0affcd87-79f5-3839-086d-000000000026]
41684 1727204462.92158: sending task result for task 0affcd87-79f5-3839-086d-000000000026
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined",
    "skip_reason": "Conditional result was False"
}
41684 1727204462.92313: no more pending results, returning what we have
41684 1727204462.92317: results queue empty
41684 1727204462.92317: checking for any_errors_fatal
41684 1727204462.92325: done checking for any_errors_fatal
41684 1727204462.92325: checking for max_fail_percentage
41684 1727204462.92327: done checking for max_fail_percentage
41684 1727204462.92328: checking to see if all hosts have failed and the running result is not ok
41684 1727204462.92329: done checking to see if all hosts have failed
41684 1727204462.92329: getting the remaining hosts for this loop
41684 1727204462.92331: done getting the remaining hosts for this loop
41684 1727204462.92335: getting the next task for host managed-node1
41684 1727204462.92341: done getting next task for host managed-node1
41684 1727204462.92345: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start NetworkManager
41684 1727204462.92348: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
41684 1727204462.92362: getting variables
41684 1727204462.92366: in VariableManager get_vars()
41684 1727204462.92407: Calling all_inventory to load vars for managed-node1
41684 1727204462.92410: Calling groups_inventory to load vars for managed-node1
41684 1727204462.92412: Calling all_plugins_inventory to load vars for managed-node1
41684 1727204462.92423: Calling all_plugins_play to load vars for managed-node1
41684 1727204462.92425: Calling groups_plugins_inventory to load vars for managed-node1
41684 1727204462.92427: Calling groups_plugins_play to load vars for managed-node1
41684 1727204462.93789: done sending task result for task 0affcd87-79f5-3839-086d-000000000026
41684 1727204462.93793: WORKER PROCESS EXITING
41684 1727204462.94428: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
41684 1727204462.96686: done with get_vars()
41684 1727204462.96720: done getting variables
41684 1727204462.96794: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Enable and start NetworkManager] *****
task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122
Tuesday 24 September 2024 15:01:02 -0400 (0:00:00.192) 0:00:19.369 *****
41684 1727204462.96827: entering _queue_task() for managed-node1/service
41684 1727204462.97180: worker is 1 (out of 1 available)
41684 1727204462.97194: exiting _queue_task() for managed-node1/service
41684 1727204462.97208: done queuing things up, now waiting for results queue to drain
41684 1727204462.97214: waiting for pending results...
41684 1727204462.97510: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager
41684 1727204462.97671: in run() - task 0affcd87-79f5-3839-086d-000000000027
41684 1727204462.97695: variable 'ansible_search_path' from source: unknown
41684 1727204462.97705: variable 'ansible_search_path' from source: unknown
41684 1727204462.97747: calling self._execute()
41684 1727204462.97851: variable 'ansible_host' from source: host vars for 'managed-node1'
41684 1727204462.97866: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1'
41684 1727204462.97888: variable 'omit' from source: magic vars
41684 1727204462.98281: variable 'ansible_distribution_major_version' from source: facts
41684 1727204462.98299: Evaluated conditional (ansible_distribution_major_version != '6'): True
41684 1727204462.98482: variable 'network_provider' from source: set_fact
41684 1727204462.98493: variable 'network_state' from source: role '' defaults
41684 1727204462.98507: Evaluated conditional (network_provider == "nm" or network_state != {}): True
41684 1727204462.98518: variable 'omit' from source: magic vars
41684 1727204462.98584: variable 'omit' from source: magic vars
41684 1727204462.98617: variable 'network_service_name' from source: role '' defaults
41684 1727204462.98699: variable 'network_service_name' from source: role '' defaults
41684 1727204462.98817: variable '__network_provider_setup' from source: role '' defaults
41684 1727204462.98827: variable '__network_service_name_default_nm' from source: role '' defaults
41684 1727204462.98900: variable '__network_service_name_default_nm' from source: role '' defaults
41684 1727204462.98914: variable '__network_packages_default_nm' from source: role '' defaults
41684 1727204462.98987: variable '__network_packages_default_nm' from source: role '' defaults
41684 1727204462.99228: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name
41684 1727204463.01652: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py
41684 1727204463.01737: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py
41684 1727204463.01777: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py
41684 1727204463.01823: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py
41684 1727204463.01854: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py
41684 1727204463.01943: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
41684 1727204463.01979: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
41684 1727204463.02017: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
41684 1727204463.02063: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
41684 1727204463.02085: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
41684 1727204463.02140: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
41684 1727204463.02168: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
41684 1727204463.02196: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
41684 1727204463.02245: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
41684 1727204463.02265: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
41684 1727204463.02516: variable '__network_packages_default_gobject_packages' from source: role '' defaults
41684 1727204463.02648: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
41684 1727204463.02686: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
41684 1727204463.02715: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 41684 1727204463.02761: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 41684 1727204463.02790: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 41684 1727204463.02893: variable 'ansible_python' from source: facts 41684 1727204463.02921: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 41684 1727204463.03015: variable '__network_wpa_supplicant_required' from source: role '' defaults 41684 1727204463.03105: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 41684 1727204463.03237: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 41684 1727204463.03267: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 41684 1727204463.03295: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 41684 1727204463.03345: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 41684 1727204463.03366: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 41684 1727204463.03417: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 41684 1727204463.03461: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 41684 1727204463.03492: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 41684 1727204463.03538: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 41684 1727204463.03560: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 41684 1727204463.03707: variable 'network_connections' from source: task vars 41684 1727204463.03719: variable 'interface0' from source: play vars 41684 1727204463.03801: variable 'interface0' from source: play vars 41684 1727204463.03816: variable 'interface0' from source: play vars 41684 1727204463.03896: variable 'interface0' from source: play vars 41684 1727204463.03928: variable 'interface1' from source: play vars 41684 1727204463.04010: variable 'interface1' from source: play vars 41684 1727204463.04026: variable 'interface1' from source: play vars 41684 1727204463.04108: variable 'interface1' from source: play vars 41684 1727204463.04239: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 
41684 1727204463.04453: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 41684 1727204463.04507: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 41684 1727204463.04558: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 41684 1727204463.04603: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 41684 1727204463.04674: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 41684 1727204463.04707: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 41684 1727204463.04748: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 41684 1727204463.04787: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 41684 1727204463.04838: variable '__network_wireless_connections_defined' from source: role '' defaults 41684 1727204463.05142: variable 'network_connections' from source: task vars 41684 1727204463.05153: variable 'interface0' from source: play vars 41684 1727204463.05241: variable 'interface0' from source: play vars 41684 1727204463.05257: variable 'interface0' from source: play vars 41684 1727204463.05339: variable 'interface0' from source: play vars 41684 1727204463.05375: variable 'interface1' from source: play vars 41684 1727204463.05455: variable 'interface1' from source: play vars 41684 1727204463.05473: variable 
'interface1' from source: play vars 41684 1727204463.05551: variable 'interface1' from source: play vars 41684 1727204463.05627: variable '__network_packages_default_wireless' from source: role '' defaults 41684 1727204463.05711: variable '__network_wireless_connections_defined' from source: role '' defaults 41684 1727204463.06013: variable 'network_connections' from source: task vars 41684 1727204463.06022: variable 'interface0' from source: play vars 41684 1727204463.06099: variable 'interface0' from source: play vars 41684 1727204463.06110: variable 'interface0' from source: play vars 41684 1727204463.06186: variable 'interface0' from source: play vars 41684 1727204463.06202: variable 'interface1' from source: play vars 41684 1727204463.06277: variable 'interface1' from source: play vars 41684 1727204463.06288: variable 'interface1' from source: play vars 41684 1727204463.06354: variable 'interface1' from source: play vars 41684 1727204463.06397: variable '__network_packages_default_team' from source: role '' defaults 41684 1727204463.06481: variable '__network_team_connections_defined' from source: role '' defaults 41684 1727204463.06791: variable 'network_connections' from source: task vars 41684 1727204463.06801: variable 'interface0' from source: play vars 41684 1727204463.06881: variable 'interface0' from source: play vars 41684 1727204463.06893: variable 'interface0' from source: play vars 41684 1727204463.06971: variable 'interface0' from source: play vars 41684 1727204463.06988: variable 'interface1' from source: play vars 41684 1727204463.07067: variable 'interface1' from source: play vars 41684 1727204463.07078: variable 'interface1' from source: play vars 41684 1727204463.07153: variable 'interface1' from source: play vars 41684 1727204463.07225: variable '__network_service_name_default_initscripts' from source: role '' defaults 41684 1727204463.07295: variable '__network_service_name_default_initscripts' from source: role '' defaults 41684 
1727204463.07306: variable '__network_packages_default_initscripts' from source: role '' defaults 41684 1727204463.07372: variable '__network_packages_default_initscripts' from source: role '' defaults 41684 1727204463.07804: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 41684 1727204463.09024: variable 'network_connections' from source: task vars 41684 1727204463.09037: variable 'interface0' from source: play vars 41684 1727204463.09182: variable 'interface0' from source: play vars 41684 1727204463.09196: variable 'interface0' from source: play vars 41684 1727204463.09266: variable 'interface0' from source: play vars 41684 1727204463.09447: variable 'interface1' from source: play vars 41684 1727204463.09512: variable 'interface1' from source: play vars 41684 1727204463.09523: variable 'interface1' from source: play vars 41684 1727204463.09592: variable 'interface1' from source: play vars 41684 1727204463.09667: variable 'ansible_distribution' from source: facts 41684 1727204463.09768: variable '__network_rh_distros' from source: role '' defaults 41684 1727204463.09780: variable 'ansible_distribution_major_version' from source: facts 41684 1727204463.09813: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 41684 1727204463.10234: variable 'ansible_distribution' from source: facts 41684 1727204463.10243: variable '__network_rh_distros' from source: role '' defaults 41684 1727204463.10253: variable 'ansible_distribution_major_version' from source: facts 41684 1727204463.10274: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 41684 1727204463.10575: variable 'ansible_distribution' from source: facts 41684 1727204463.10639: variable '__network_rh_distros' from source: role '' defaults 41684 1727204463.10649: variable 'ansible_distribution_major_version' from source: facts 41684 1727204463.10785: variable 'network_provider' from source: 
set_fact 41684 1727204463.10814: variable 'omit' from source: magic vars 41684 1727204463.10963: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 41684 1727204463.10999: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 41684 1727204463.11025: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 41684 1727204463.11046: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41684 1727204463.11071: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41684 1727204463.11107: variable 'inventory_hostname' from source: host vars for 'managed-node1' 41684 1727204463.11187: variable 'ansible_host' from source: host vars for 'managed-node1' 41684 1727204463.11196: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41684 1727204463.11416: Set connection var ansible_connection to ssh 41684 1727204463.11428: Set connection var ansible_pipelining to False 41684 1727204463.11439: Set connection var ansible_module_compression to ZIP_DEFLATED 41684 1727204463.11450: Set connection var ansible_timeout to 10 41684 1727204463.11462: Set connection var ansible_shell_executable to /bin/sh 41684 1727204463.11472: Set connection var ansible_shell_type to sh 41684 1727204463.11532: variable 'ansible_shell_executable' from source: unknown 41684 1727204463.11618: variable 'ansible_connection' from source: unknown 41684 1727204463.11625: variable 'ansible_module_compression' from source: unknown 41684 1727204463.11630: variable 'ansible_shell_type' from source: unknown 41684 1727204463.11635: variable 'ansible_shell_executable' from source: unknown 41684 1727204463.11641: variable 'ansible_host' from source: host vars for 'managed-node1' 41684 1727204463.11647: 
variable 'ansible_pipelining' from source: unknown 41684 1727204463.11652: variable 'ansible_timeout' from source: unknown 41684 1727204463.11658: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41684 1727204463.11872: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 41684 1727204463.11886: variable 'omit' from source: magic vars 41684 1727204463.11894: starting attempt loop 41684 1727204463.11900: running the handler 41684 1727204463.11998: variable 'ansible_facts' from source: unknown 41684 1727204463.13745: _low_level_execute_command(): starting 41684 1727204463.13759: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 41684 1727204463.15595: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 41684 1727204463.15615: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41684 1727204463.15640: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41684 1727204463.15661: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41684 1727204463.15709: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 41684 1727204463.15721: stderr chunk (state=3): >>>debug2: match not found <<< 41684 1727204463.15742: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204463.15761: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 41684 1727204463.15775: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 41684 1727204463.15785: stderr chunk (state=3): 
>>>debug1: re-parsing configuration <<< 41684 1727204463.15795: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41684 1727204463.15807: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41684 1727204463.15821: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41684 1727204463.15831: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 41684 1727204463.15841: stderr chunk (state=3): >>>debug2: match found <<< 41684 1727204463.15859: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204463.15969: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 41684 1727204463.16095: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41684 1727204463.16185: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41684 1727204463.16765: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41684 1727204463.18431: stdout chunk (state=3): >>>/root <<< 41684 1727204463.18530: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41684 1727204463.18620: stderr chunk (state=3): >>><<< 41684 1727204463.18623: stdout chunk (state=3): >>><<< 41684 1727204463.18744: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is 
address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41684 1727204463.18748: _low_level_execute_command(): starting 41684 1727204463.18751: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204463.1864605-43383-230519149052849 `" && echo ansible-tmp-1727204463.1864605-43383-230519149052849="` echo /root/.ansible/tmp/ansible-tmp-1727204463.1864605-43383-230519149052849 `" ) && sleep 0' 41684 1727204463.20368: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 41684 1727204463.20372: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41684 1727204463.20406: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found <<< 41684 1727204463.20410: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 41684 1727204463.20413: stderr chunk (state=3): 
>>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204463.20483: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 41684 1727204463.21007: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41684 1727204463.21070: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41684 1727204463.22940: stdout chunk (state=3): >>>ansible-tmp-1727204463.1864605-43383-230519149052849=/root/.ansible/tmp/ansible-tmp-1727204463.1864605-43383-230519149052849 <<< 41684 1727204463.23122: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41684 1727204463.23127: stdout chunk (state=3): >>><<< 41684 1727204463.23133: stderr chunk (state=3): >>><<< 41684 1727204463.23153: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204463.1864605-43383-230519149052849=/root/.ansible/tmp/ansible-tmp-1727204463.1864605-43383-230519149052849 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 
10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41684 1727204463.23192: variable 'ansible_module_compression' from source: unknown 41684 1727204463.23253: ANSIBALLZ: Using generic lock for ansible.legacy.systemd 41684 1727204463.23258: ANSIBALLZ: Acquiring lock 41684 1727204463.23261: ANSIBALLZ: Lock acquired: 139842516808240 41684 1727204463.23263: ANSIBALLZ: Creating module 41684 1727204463.67407: ANSIBALLZ: Writing module into payload 41684 1727204463.67624: ANSIBALLZ: Writing module 41684 1727204463.67660: ANSIBALLZ: Renaming module 41684 1727204463.67671: ANSIBALLZ: Done creating module 41684 1727204463.67713: variable 'ansible_facts' from source: unknown 41684 1727204463.68269: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204463.1864605-43383-230519149052849/AnsiballZ_systemd.py 41684 1727204463.68272: Sending initial data 41684 1727204463.68275: Sent initial data (156 bytes) 41684 1727204463.69101: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 41684 1727204463.69116: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41684 1727204463.69125: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41684 1727204463.69139: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41684 1727204463.69184: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 41684 1727204463.69191: stderr chunk (state=3): >>>debug2: match not found <<< 41684 1727204463.69201: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 
1727204463.69220: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 41684 1727204463.69252: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 41684 1727204463.69255: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 41684 1727204463.69271: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41684 1727204463.69281: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41684 1727204463.69293: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41684 1727204463.69300: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 41684 1727204463.69306: stderr chunk (state=3): >>>debug2: match found <<< 41684 1727204463.69316: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204463.69395: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 41684 1727204463.69411: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41684 1727204463.69414: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41684 1727204463.69504: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41684 1727204463.71262: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension 
"expand-path@openssh.com" revision 1 <<< 41684 1727204463.71316: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 41684 1727204463.71369: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-41684fyviudxd/tmp8wclp_2l /root/.ansible/tmp/ansible-tmp-1727204463.1864605-43383-230519149052849/AnsiballZ_systemd.py <<< 41684 1727204463.71430: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 41684 1727204463.74274: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41684 1727204463.74343: stderr chunk (state=3): >>><<< 41684 1727204463.74347: stdout chunk (state=3): >>><<< 41684 1727204463.74369: done transferring module to remote 41684 1727204463.74379: _low_level_execute_command(): starting 41684 1727204463.74384: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204463.1864605-43383-230519149052849/ /root/.ansible/tmp/ansible-tmp-1727204463.1864605-43383-230519149052849/AnsiballZ_systemd.py && sleep 0' 41684 1727204463.76102: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 41684 1727204463.76106: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41684 1727204463.76153: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204463.76157: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 41684 1727204463.76176: stderr 
chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41684 1727204463.76182: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204463.76258: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 41684 1727204463.76275: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41684 1727204463.76286: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41684 1727204463.77052: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41684 1727204463.78860: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41684 1727204463.78873: stderr chunk (state=3): >>><<< 41684 1727204463.78876: stdout chunk (state=3): >>><<< 41684 1727204463.78889: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41684 1727204463.78892: _low_level_execute_command(): starting 41684 1727204463.78898: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204463.1864605-43383-230519149052849/AnsiballZ_systemd.py && sleep 0' 41684 1727204463.80570: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 41684 1727204463.80577: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41684 1727204463.80671: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found <<< 41684 1727204463.80677: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204463.80749: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 41684 1727204463.80755: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found <<< 41684 1727204463.80772: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204463.80843: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 41684 1727204463.80974: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41684 1727204463.80978: stderr chunk 
(state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41684 1727204463.81069: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41684 1727204464.06161: stdout chunk (state=3): >>> {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "NotifyAccess": "none", "RestartUSec": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "619", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Tue 2024-09-24 14:49:25 EDT", "ExecMainStartTimestampMonotonic": "28837083", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "619", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; 
argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "2418", "MemoryCurrent": "14225408", "MemoryAvailable": "infinity", "CPUUsageNSec": "1396831000", "TasksCurrent": "3", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "MemoryHigh": "infinity", "MemoryMax": "infinity", "MemorySwapMax": "infinity", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22342", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", 
"LimitASSoft": "infinity", "LimitNPROC": "13964", "LimitNPROCSoft": "13964", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13964", "LimitSIGPENDINGSoft": "13964", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", 
"TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "dbus.socket sysinit.target system.slice", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "cloud-init.service NetworkManager-wait-online.service network.target network.service shutdown.target multi-user.target", "After": "dbus.socket systemd-journald.socket sysinit.target network-pre.target basic.target system.slice cloud-init-local.service dbus-broker.service", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Tue 2024-09-24 14:54:21 EDT", "StateChangeTimestampMonotonic": "324827295", "InactiveExitTimestamp": "Tue 2024-09-24 14:49:25 EDT", "InactiveExitTimestampMonotonic": "28837278", "ActiveEnterTimestamp": "Tue 2024-09-24 14:49:27 EDT", "ActiveEnterTimestampMonotonic": "30313565", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", 
"OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Tue 2024-09-24 14:49:25 EDT", "ConditionTimestampMonotonic": "28833288", "AssertTimestamp": "Tue 2024-09-24 14:49:25 EDT", "AssertTimestampMonotonic": "28833291", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "a065c0d4382c4b51bfc5a74ffa3d403d", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} <<< 41684 1727204464.07631: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.148 closed. 
<<< 41684 1727204464.07635: stdout chunk (state=3): >>><<< 41684 1727204464.07637: stderr chunk (state=3): >>><<< 41684 1727204464.07774: _low_level_execute_command() done: rc=0, stdout= {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "NotifyAccess": "none", "RestartUSec": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "619", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Tue 2024-09-24 14:49:25 EDT", "ExecMainStartTimestampMonotonic": "28837083", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "619", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call 
org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "2418", "MemoryCurrent": "14225408", "MemoryAvailable": "infinity", "CPUUsageNSec": "1396831000", "TasksCurrent": "3", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "MemoryHigh": "infinity", "MemoryMax": "infinity", "MemorySwapMax": "infinity", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22342", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", 
"LimitNPROC": "13964", "LimitNPROCSoft": "13964", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13964", "LimitSIGPENDINGSoft": "13964", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", 
"MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "dbus.socket sysinit.target system.slice", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "cloud-init.service NetworkManager-wait-online.service network.target network.service shutdown.target multi-user.target", "After": "dbus.socket systemd-journald.socket sysinit.target network-pre.target basic.target system.slice cloud-init-local.service dbus-broker.service", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Tue 2024-09-24 14:54:21 EDT", "StateChangeTimestampMonotonic": "324827295", "InactiveExitTimestamp": "Tue 2024-09-24 14:49:25 EDT", "InactiveExitTimestampMonotonic": "28837278", "ActiveEnterTimestamp": "Tue 2024-09-24 14:49:27 EDT", "ActiveEnterTimestampMonotonic": "30313565", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "OnSuccessJobMode": "fail", 
"OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Tue 2024-09-24 14:49:25 EDT", "ConditionTimestampMonotonic": "28833288", "AssertTimestamp": "Tue 2024-09-24 14:49:25 EDT", "AssertTimestampMonotonic": "28833291", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "a065c0d4382c4b51bfc5a74ffa3d403d", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master 
session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.148 closed. 41684 1727204464.07867: done with _execute_module (ansible.legacy.systemd, {'name': 'NetworkManager', 'state': 'started', 'enabled': True, '_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.systemd', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204463.1864605-43383-230519149052849/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 41684 1727204464.07898: _low_level_execute_command(): starting 41684 1727204464.07908: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204463.1864605-43383-230519149052849/ > /dev/null 2>&1 && sleep 0' 41684 1727204464.08600: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 41684 1727204464.08615: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41684 1727204464.08631: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41684 1727204464.08656: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41684 1727204464.08703: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 41684 1727204464.08715: stderr chunk (state=3): >>>debug2: match not found <<< 41684 1727204464.08729: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204464.08753: stderr chunk 
(state=3): >>>debug1: configuration requests final Match pass <<< 41684 1727204464.08767: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 41684 1727204464.08779: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 41684 1727204464.08807: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41684 1727204464.08831: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41684 1727204464.08848: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41684 1727204464.08867: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 41684 1727204464.09011: stderr chunk (state=3): >>>debug2: match found <<< 41684 1727204464.09040: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204464.09144: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 41684 1727204464.09168: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41684 1727204464.09185: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41684 1727204464.09270: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41684 1727204464.11105: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41684 1727204464.11181: stderr chunk (state=3): >>><<< 41684 1727204464.11185: stdout chunk (state=3): >>><<< 41684 1727204464.11474: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41684 1727204464.11477: handler run complete 41684 1727204464.11480: attempt loop complete, returning result 41684 1727204464.11482: _execute() done 41684 1727204464.11484: dumping result to json 41684 1727204464.11486: done dumping result, returning 41684 1727204464.11488: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager [0affcd87-79f5-3839-086d-000000000027] 41684 1727204464.11490: sending task result for task 0affcd87-79f5-3839-086d-000000000027 41684 1727204464.11633: done sending task result for task 0affcd87-79f5-3839-086d-000000000027 41684 1727204464.11636: WORKER PROCESS EXITING ok: [managed-node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 41684 1727204464.11707: no more pending results, returning what we have 41684 1727204464.11711: results queue empty 41684 1727204464.11712: checking for any_errors_fatal 41684 1727204464.11721: done checking for any_errors_fatal 41684 1727204464.11722: checking for max_fail_percentage 41684 1727204464.11724: done checking for max_fail_percentage 41684 1727204464.11725: checking to see 
if all hosts have failed and the running result is not ok 41684 1727204464.11726: done checking to see if all hosts have failed 41684 1727204464.11727: getting the remaining hosts for this loop 41684 1727204464.11729: done getting the remaining hosts for this loop 41684 1727204464.11733: getting the next task for host managed-node1 41684 1727204464.11740: done getting next task for host managed-node1 41684 1727204464.11745: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 41684 1727204464.11748: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 41684 1727204464.11758: getting variables 41684 1727204464.11761: in VariableManager get_vars() 41684 1727204464.11805: Calling all_inventory to load vars for managed-node1 41684 1727204464.11808: Calling groups_inventory to load vars for managed-node1 41684 1727204464.11811: Calling all_plugins_inventory to load vars for managed-node1 41684 1727204464.11822: Calling all_plugins_play to load vars for managed-node1 41684 1727204464.11825: Calling groups_plugins_inventory to load vars for managed-node1 41684 1727204464.11828: Calling groups_plugins_play to load vars for managed-node1 41684 1727204464.14898: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41684 1727204464.18654: done with get_vars() 41684 1727204464.18692: done getting variables 41684 1727204464.18871: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start wpa_supplicant] ***** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133 Tuesday 24 September 2024 15:01:04 -0400 (0:00:01.220) 0:00:20.590 ***** 41684 1727204464.18909: entering _queue_task() for managed-node1/service 41684 1727204464.19679: worker is 1 (out of 1 available) 41684 1727204464.19693: exiting _queue_task() for managed-node1/service 41684 1727204464.19707: done queuing things up, now waiting for results queue to drain 41684 1727204464.19709: waiting for pending results... 
41684 1727204464.20240: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 41684 1727204464.21011: in run() - task 0affcd87-79f5-3839-086d-000000000028 41684 1727204464.21033: variable 'ansible_search_path' from source: unknown 41684 1727204464.21040: variable 'ansible_search_path' from source: unknown 41684 1727204464.21087: calling self._execute() 41684 1727204464.21182: variable 'ansible_host' from source: host vars for 'managed-node1' 41684 1727204464.21193: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41684 1727204464.21208: variable 'omit' from source: magic vars 41684 1727204464.21568: variable 'ansible_distribution_major_version' from source: facts 41684 1727204464.22384: Evaluated conditional (ansible_distribution_major_version != '6'): True 41684 1727204464.22510: variable 'network_provider' from source: set_fact 41684 1727204464.22521: Evaluated conditional (network_provider == "nm"): True 41684 1727204464.22626: variable '__network_wpa_supplicant_required' from source: role '' defaults 41684 1727204464.22722: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 41684 1727204464.22901: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 41684 1727204464.27544: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 41684 1727204464.27622: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 41684 1727204464.27671: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 41684 1727204464.27711: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 41684 1727204464.27743: Loading FilterModule 'urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py
41684 1727204464.27842: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
41684 1727204464.27878: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
41684 1727204464.27910: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
41684 1727204464.28611: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
41684 1727204464.28632: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
41684 1727204464.28688: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
41684 1727204464.28716: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
41684 1727204464.28746: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
41684 1727204464.28795: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
41684 1727204464.28814: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
41684 1727204464.28861: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
41684 1727204464.28894: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
41684 1727204464.28924: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
41684 1727204464.28973: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
41684 1727204464.28993: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
41684 1727204464.29143: variable 'network_connections' from source: task vars
41684 1727204464.29167: variable 'interface0' from source: play vars
41684 1727204464.29254: variable 'interface0' from source: play vars
41684 1727204464.29274: variable 'interface0' from source: play vars
41684 1727204464.29337: variable 'interface0' from source: play vars
41684 1727204464.29688: variable 'interface1' from source: play vars
41684 1727204464.29750: variable 'interface1' from source: play vars
41684 1727204464.29767: variable 'interface1' from source: play vars
41684 1727204464.29829: variable 'interface1' from source: play vars
41684 1727204464.29915: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name
41684 1727204464.30180: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py
41684 1727204464.30252: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py
41684 1727204464.30291: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py
41684 1727204464.30325: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py
41684 1727204464.30379: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False)
41684 1727204464.30400: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False)
41684 1727204464.30423: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False)
41684 1727204464.30456: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False)
41684 1727204464.30509: variable '__network_wireless_connections_defined' from source: role '' defaults
41684 1727204464.30777: variable 'network_connections' from source: task vars
41684 1727204464.30781: variable 'interface0' from source: play vars
41684 1727204464.30845: variable 'interface0' from source: play vars
41684 1727204464.30852: variable 'interface0' from source: play vars
41684 1727204464.30924: variable 'interface0' from source: play vars
41684 1727204464.30935: variable 'interface1' from source: play vars
41684 1727204464.31000: variable 'interface1' from source: play vars
41684 1727204464.31006: variable 'interface1' from source: play vars
41684 1727204464.31069: variable 'interface1' from source: play vars
41684 1727204464.31119: Evaluated conditional (__network_wpa_supplicant_required): False
41684 1727204464.31122: when evaluation is False, skipping this task
41684 1727204464.31125: _execute() done
41684 1727204464.31131: dumping result to json
41684 1727204464.31134: done dumping result, returning
41684 1727204464.31143: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant [0affcd87-79f5-3839-086d-000000000028]
41684 1727204464.31148: sending task result for task 0affcd87-79f5-3839-086d-000000000028
41684 1727204464.31243: done sending task result for task 0affcd87-79f5-3839-086d-000000000028
41684 1727204464.31245: WORKER PROCESS EXITING
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "__network_wpa_supplicant_required",
    "skip_reason": "Conditional result was False"
}
41684 1727204464.31294: no more pending results, returning what we have
41684 1727204464.31298: results queue empty
41684 1727204464.31299: checking for any_errors_fatal
41684 1727204464.31321: done checking for any_errors_fatal
41684 1727204464.31322: checking for max_fail_percentage
41684 1727204464.31324: done checking for max_fail_percentage
41684 1727204464.31324: checking to see if all hosts have failed and the running result is not ok
41684 1727204464.31325: done checking to see if all hosts have failed
41684 1727204464.31326: getting the remaining hosts for this loop
41684 1727204464.31327: done getting the remaining hosts for this loop
41684 1727204464.31331: getting the next task for host managed-node1
41684 1727204464.31338: done getting next task for host managed-node1
41684 1727204464.31342: ^ task is: TASK: fedora.linux_system_roles.network : Enable network service
41684 1727204464.31344: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
41684 1727204464.31359: getting variables
41684 1727204464.31360: in VariableManager get_vars()
41684 1727204464.31407: Calling all_inventory to load vars for managed-node1
41684 1727204464.31410: Calling groups_inventory to load vars for managed-node1
41684 1727204464.31413: Calling all_plugins_inventory to load vars for managed-node1
41684 1727204464.31423: Calling all_plugins_play to load vars for managed-node1
41684 1727204464.31425: Calling groups_plugins_inventory to load vars for managed-node1
41684 1727204464.31428: Calling groups_plugins_play to load vars for managed-node1
41684 1727204464.33399: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
41684 1727204464.35399: done with get_vars()
41684 1727204464.35432: done getting variables
41684 1727204464.35496: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Enable network service] **************
task path:
/tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142
Tuesday 24 September 2024 15:01:04 -0400 (0:00:00.166) 0:00:20.756 *****
41684 1727204464.35531: entering _queue_task() for managed-node1/service
41684 1727204464.35873: worker is 1 (out of 1 available)
41684 1727204464.35887: exiting _queue_task() for managed-node1/service
41684 1727204464.35899: done queuing things up, now waiting for results queue to drain
41684 1727204464.35900: waiting for pending results...
41684 1727204464.36319: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Enable network service
41684 1727204464.36470: in run() - task 0affcd87-79f5-3839-086d-000000000029
41684 1727204464.36497: variable 'ansible_search_path' from source: unknown
41684 1727204464.36505: variable 'ansible_search_path' from source: unknown
41684 1727204464.36547: calling self._execute()
41684 1727204464.36654: variable 'ansible_host' from source: host vars for 'managed-node1'
41684 1727204464.36671: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1'
41684 1727204464.36687: variable 'omit' from source: magic vars
41684 1727204464.37301: variable 'ansible_distribution_major_version' from source: facts
41684 1727204464.37370: Evaluated conditional (ansible_distribution_major_version != '6'): True
41684 1727204464.37605: variable 'network_provider' from source: set_fact
41684 1727204464.37616: Evaluated conditional (network_provider == "initscripts"): False
41684 1727204464.37622: when evaluation is False, skipping this task
41684 1727204464.37687: _execute() done
41684 1727204464.37694: dumping result to json
41684 1727204464.37700: done dumping result, returning
41684 1727204464.37709: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Enable network service [0affcd87-79f5-3839-086d-000000000029]
41684 1727204464.37719: sending task result for task 0affcd87-79f5-3839-086d-000000000029
skipping: [managed-node1] => {
    "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result",
    "changed": false
}
41684 1727204464.37876: no more pending results, returning what we have
41684 1727204464.37881: results queue empty
41684 1727204464.37882: checking for any_errors_fatal
41684 1727204464.37891: done checking for any_errors_fatal
41684 1727204464.37891: checking for max_fail_percentage
41684 1727204464.37893: done checking for max_fail_percentage
41684 1727204464.37894: checking to see if all hosts have failed and the running result is not ok
41684 1727204464.37894: done checking to see if all hosts have failed
41684 1727204464.37895: getting the remaining hosts for this loop
41684 1727204464.37897: done getting the remaining hosts for this loop
41684 1727204464.37901: getting the next task for host managed-node1
41684 1727204464.37908: done getting next task for host managed-node1
41684 1727204464.37913: ^ task is: TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present
41684 1727204464.37916: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
41684 1727204464.37932: getting variables
41684 1727204464.37934: in VariableManager get_vars()
41684 1727204464.37983: Calling all_inventory to load vars for managed-node1
41684 1727204464.37986: Calling groups_inventory to load vars for managed-node1
41684 1727204464.37988: Calling all_plugins_inventory to load vars for managed-node1
41684 1727204464.38000: Calling all_plugins_play to load vars for managed-node1
41684 1727204464.38002: Calling groups_plugins_inventory to load vars for managed-node1
41684 1727204464.38005: Calling groups_plugins_play to load vars for managed-node1
41684 1727204464.39582: done sending task result for task 0affcd87-79f5-3839-086d-000000000029
41684 1727204464.39586: WORKER PROCESS EXITING
41684 1727204464.40714: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
41684 1727204464.43044: done with get_vars()
41684 1727204464.43075: done getting variables
41684 1727204464.43135: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Ensure initscripts network file dependency is present] ***
task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150
Tuesday 24 September 2024 15:01:04 -0400 (0:00:00.079) 0:00:20.836 *****
41684 1727204464.43474: entering _queue_task() for managed-node1/copy
41684 1727204464.44020: worker is 1 (out of 1 available)
41684 1727204464.44031: exiting _queue_task() for managed-node1/copy
41684 1727204464.44044: done queuing things up, now waiting for results queue to drain
41684 1727204464.44046: waiting for pending results...
41684 1727204464.44994: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present
41684 1727204464.45132: in run() - task 0affcd87-79f5-3839-086d-00000000002a
41684 1727204464.45156: variable 'ansible_search_path' from source: unknown
41684 1727204464.45166: variable 'ansible_search_path' from source: unknown
41684 1727204464.45206: calling self._execute()
41684 1727204464.45305: variable 'ansible_host' from source: host vars for 'managed-node1'
41684 1727204464.45316: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1'
41684 1727204464.45330: variable 'omit' from source: magic vars
41684 1727204464.45703: variable 'ansible_distribution_major_version' from source: facts
41684 1727204464.45720: Evaluated conditional (ansible_distribution_major_version != '6'): True
41684 1727204464.45839: variable 'network_provider' from source: set_fact
41684 1727204464.45850: Evaluated conditional (network_provider == "initscripts"): False
41684 1727204464.45856: when evaluation is False, skipping this task
41684 1727204464.45862: _execute() done
41684 1727204464.45871: dumping result to json
41684 1727204464.45877: done dumping result, returning
41684 1727204464.45887: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present [0affcd87-79f5-3839-086d-00000000002a]
41684 1727204464.45902: sending task result for task 0affcd87-79f5-3839-086d-00000000002a
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "network_provider == \"initscripts\"",
    "skip_reason": "Conditional result was False"
}
41684 1727204464.46052: no more pending results, returning what we have
41684 1727204464.46057: results queue empty
41684 1727204464.46058: checking for any_errors_fatal
41684 1727204464.46072: done checking for any_errors_fatal
41684 1727204464.46074: checking for max_fail_percentage
41684 1727204464.46076: done checking for max_fail_percentage
41684 1727204464.46076: checking to see if all hosts have failed and the running result is not ok
41684 1727204464.46077: done checking to see if all hosts have failed
41684 1727204464.46078: getting the remaining hosts for this loop
41684 1727204464.46080: done getting the remaining hosts for this loop
41684 1727204464.46084: getting the next task for host managed-node1
41684 1727204464.46091: done getting next task for host managed-node1
41684 1727204464.46096: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking connection profiles
41684 1727204464.46099: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
41684 1727204464.46116: getting variables
41684 1727204464.46118: in VariableManager get_vars()
41684 1727204464.46162: Calling all_inventory to load vars for managed-node1
41684 1727204464.46167: Calling groups_inventory to load vars for managed-node1
41684 1727204464.46170: Calling all_plugins_inventory to load vars for managed-node1
41684 1727204464.46182: Calling all_plugins_play to load vars for managed-node1
41684 1727204464.46185: Calling groups_plugins_inventory to load vars for managed-node1
41684 1727204464.46188: Calling groups_plugins_play to load vars for managed-node1
41684 1727204464.47587: done sending task result for task 0affcd87-79f5-3839-086d-00000000002a
41684 1727204464.47591: WORKER PROCESS EXITING
41684 1727204464.48687: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
41684 1727204464.50542: done with get_vars()
41684 1727204464.50562: done getting variables

TASK [fedora.linux_system_roles.network : Configure networking connection profiles] ***
task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159
Tuesday 24 September 2024 15:01:04 -0400 (0:00:00.071) 0:00:20.907 *****
41684 1727204464.50629: entering _queue_task() for managed-node1/fedora.linux_system_roles.network_connections
41684 1727204464.50630: Creating lock for fedora.linux_system_roles.network_connections
41684 1727204464.50872: worker is 1 (out of 1 available)
41684 1727204464.50885: exiting _queue_task() for managed-node1/fedora.linux_system_roles.network_connections
41684 1727204464.50899: done queuing things up, now waiting for results queue to drain
41684 1727204464.50900: waiting for pending results...
41684 1727204464.51088: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Configure networking connection profiles
41684 1727204464.51181: in run() - task 0affcd87-79f5-3839-086d-00000000002b
41684 1727204464.51193: variable 'ansible_search_path' from source: unknown
41684 1727204464.51197: variable 'ansible_search_path' from source: unknown
41684 1727204464.51229: calling self._execute()
41684 1727204464.51299: variable 'ansible_host' from source: host vars for 'managed-node1'
41684 1727204464.51303: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1'
41684 1727204464.51312: variable 'omit' from source: magic vars
41684 1727204464.51593: variable 'ansible_distribution_major_version' from source: facts
41684 1727204464.51605: Evaluated conditional (ansible_distribution_major_version != '6'): True
41684 1727204464.51610: variable 'omit' from source: magic vars
41684 1727204464.51649: variable 'omit' from source: magic vars
41684 1727204464.51766: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name
41684 1727204464.54217: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py
41684 1727204464.54293: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py
41684 1727204464.54326: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py
41684 1727204464.54369: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py
41684 1727204464.54396: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py
41684 1727204464.54487: variable 'network_provider' from source: set_fact
41684 1727204464.54625: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
41684 1727204464.54669: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
41684 1727204464.54799: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
41684 1727204464.54802: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
41684 1727204464.54805: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
41684 1727204464.54889: variable 'omit' from source: magic vars
41684 1727204464.55041: variable 'omit' from source: magic vars
41684 1727204464.55044: variable 'network_connections' from source: task vars
41684 1727204464.55046: variable 'interface0' from source: play vars
41684 1727204464.55113: variable 'interface0' from source: play vars
41684 1727204464.55117: variable 'interface0' from source: play vars
41684 1727204464.55179: variable 'interface0' from source: play vars
41684 1727204464.55192: variable 'interface1' from source: play vars
41684 1727204464.55251: variable 'interface1' from source: play vars
41684 1727204464.55257: variable 'interface1' from source: play vars
41684 1727204464.55322: variable 'interface1' from source: play vars
41684 1727204464.55535: variable 'omit' from source: magic vars
41684 1727204464.55543: variable '__lsr_ansible_managed' from source: task vars
41684 1727204464.55606: variable '__lsr_ansible_managed' from source: task vars
41684 1727204464.55876: Loaded config def from plugin (lookup/template)
41684 1727204464.55880: Loading LookupModule 'template' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/template.py
41684 1727204464.55910: File lookup term: get_ansible_managed.j2
41684 1727204464.55912: variable 'ansible_search_path' from source: unknown
41684 1727204464.55915: evaluation_path:
/tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network
/tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks
/tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks
41684 1727204464.55927: search_path:
/tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/templates/get_ansible_managed.j2
/tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/get_ansible_managed.j2
/tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/templates/get_ansible_managed.j2
/tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/get_ansible_managed.j2
/tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2
/tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2
/tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2
/tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2
41684 1727204464.55948: variable 'ansible_search_path' from source: unknown
41684 1727204464.60223: variable 'ansible_managed' from source: unknown
41684 1727204464.60304: variable 'omit' from source: magic vars
41684 1727204464.60325: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection
41684 1727204464.60345: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False)
41684 1727204464.60361: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell
41684 1727204464.60378: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
41684 1727204464.60385: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
41684 1727204464.60405: variable 'inventory_hostname' from source: host vars for 'managed-node1'
41684 1727204464.60408: variable 'ansible_host' from source: host vars for 'managed-node1'
41684 1727204464.60412: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1'
41684 1727204464.60480: Set connection var ansible_connection to ssh
41684 1727204464.60483: Set connection var ansible_pipelining to False
41684 1727204464.60490: Set connection var ansible_module_compression to ZIP_DEFLATED
41684 1727204464.60495: Set connection var ansible_timeout to 10
41684 1727204464.60501: Set connection var ansible_shell_executable to /bin/sh
41684 1727204464.60504: Set connection var ansible_shell_type to sh
41684 1727204464.60524: variable 'ansible_shell_executable' from source: unknown
41684 1727204464.60527: variable 'ansible_connection' from source: unknown
41684 1727204464.60529: variable 'ansible_module_compression' from source: unknown
41684 1727204464.60531: variable 'ansible_shell_type' from source: unknown
41684 1727204464.60535: variable 'ansible_shell_executable' from source: unknown
41684 1727204464.60537: variable 'ansible_host' from source: host vars for 'managed-node1'
41684 1727204464.60539: variable 'ansible_pipelining' from source: unknown
41684 1727204464.60541: variable 'ansible_timeout' from source: unknown
41684 1727204464.60552: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1'
41684 1727204464.60639: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__)
41684 1727204464.60647: variable 'omit' from source: magic vars
41684 1727204464.60654: starting attempt loop
41684 1727204464.60657: running the handler
41684 1727204464.60674: _low_level_execute_command(): starting
41684 1727204464.60680: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0'
41684 1727204464.61180: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<<
41684 1727204464.61209: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found <<<
41684 1727204464.61223: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<<
41684 1727204464.61235: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
41684 1727204464.61288: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<<
41684 1727204464.61294: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<<
41684 1727204464.61310: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master
version 4 <<< 41684 1727204464.61379: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41684 1727204464.62947: stdout chunk (state=3): >>>/root <<< 41684 1727204464.63119: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41684 1727204464.63140: stderr chunk (state=3): >>><<< 41684 1727204464.63150: stdout chunk (state=3): >>><<< 41684 1727204464.63188: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41684 1727204464.63206: _low_level_execute_command(): starting 41684 1727204464.63218: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204464.6319501-43431-131528484259047 `" && echo ansible-tmp-1727204464.6319501-43431-131528484259047="` echo 
/root/.ansible/tmp/ansible-tmp-1727204464.6319501-43431-131528484259047 `" ) && sleep 0' 41684 1727204464.63945: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 41684 1727204464.63966: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41684 1727204464.63984: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41684 1727204464.64004: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41684 1727204464.64056: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 41684 1727204464.64075: stderr chunk (state=3): >>>debug2: match not found <<< 41684 1727204464.64090: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204464.64109: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 41684 1727204464.64125: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 41684 1727204464.64137: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 41684 1727204464.64149: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41684 1727204464.64179: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41684 1727204464.64198: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41684 1727204464.64212: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 41684 1727204464.64224: stderr chunk (state=3): >>>debug2: match found <<< 41684 1727204464.64243: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204464.64331: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 41684 1727204464.64352: 
stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41684 1727204464.64373: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41684 1727204464.64469: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41684 1727204464.66300: stdout chunk (state=3): >>>ansible-tmp-1727204464.6319501-43431-131528484259047=/root/.ansible/tmp/ansible-tmp-1727204464.6319501-43431-131528484259047 <<< 41684 1727204464.66419: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41684 1727204464.66472: stderr chunk (state=3): >>><<< 41684 1727204464.66476: stdout chunk (state=3): >>><<< 41684 1727204464.66494: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204464.6319501-43431-131528484259047=/root/.ansible/tmp/ansible-tmp-1727204464.6319501-43431-131528484259047 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from 
master 0 41684 1727204464.66532: variable 'ansible_module_compression' from source: unknown 41684 1727204464.66574: ANSIBALLZ: Using lock for fedora.linux_system_roles.network_connections 41684 1727204464.66577: ANSIBALLZ: Acquiring lock 41684 1727204464.66580: ANSIBALLZ: Lock acquired: 139842512571568 41684 1727204464.66583: ANSIBALLZ: Creating module 41684 1727204464.89387: ANSIBALLZ: Writing module into payload 41684 1727204464.89856: ANSIBALLZ: Writing module 41684 1727204464.89888: ANSIBALLZ: Renaming module 41684 1727204464.89892: ANSIBALLZ: Done creating module 41684 1727204464.89917: variable 'ansible_facts' from source: unknown 41684 1727204464.90027: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204464.6319501-43431-131528484259047/AnsiballZ_network_connections.py 41684 1727204464.90177: Sending initial data 41684 1727204464.90180: Sent initial data (168 bytes) 41684 1727204464.91231: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 41684 1727204464.91242: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41684 1727204464.91252: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41684 1727204464.91271: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41684 1727204464.91309: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 41684 1727204464.91315: stderr chunk (state=3): >>>debug2: match not found <<< 41684 1727204464.91325: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204464.91338: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 41684 1727204464.91345: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 41684 1727204464.91352: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 
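Each trace line above starts with the worker PID followed by a Unix epoch timestamp (here around 1727204464.9, during the AnsiballZ module transfer). A tiny helper, not part of Ansible itself, can render those stamps as wall-clock time for correlating entries:

```python
from datetime import datetime, timezone

def trace_time(stamp: float) -> str:
    """Render an epoch timestamp from the trace as UTC wall-clock time."""
    return datetime.fromtimestamp(stamp, tz=timezone.utc).strftime("%Y-%m-%d %H:%M:%S")

# The module-transfer entries above were logged at:
print(trace_time(1727204464.91352))  # 2024-09-24 19:01:04 (UTC)
```

This matches the task banner later in the log, which reports Tuesday 24 September 2024 15:01:05 -0400.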
41684 1727204464.91359: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41684 1727204464.91871: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41684 1727204464.91874: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41684 1727204464.91876: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 41684 1727204464.91878: stderr chunk (state=3): >>>debug2: match found <<< 41684 1727204464.91880: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204464.91882: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 41684 1727204464.91884: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41684 1727204464.91886: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41684 1727204464.91887: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41684 1727204464.93333: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 41684 1727204464.93387: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 41684 1727204464.93445: stdout chunk (state=3): >>>sftp> put 
/root/.ansible/tmp/ansible-local-41684fyviudxd/tmpyewec1vt /root/.ansible/tmp/ansible-tmp-1727204464.6319501-43431-131528484259047/AnsiballZ_network_connections.py <<< 41684 1727204464.93493: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 41684 1727204464.95799: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41684 1727204464.95880: stderr chunk (state=3): >>><<< 41684 1727204464.95884: stdout chunk (state=3): >>><<< 41684 1727204464.95907: done transferring module to remote 41684 1727204464.95919: _low_level_execute_command(): starting 41684 1727204464.95923: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204464.6319501-43431-131528484259047/ /root/.ansible/tmp/ansible-tmp-1727204464.6319501-43431-131528484259047/AnsiballZ_network_connections.py && sleep 0' 41684 1727204464.97533: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 41684 1727204464.97656: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41684 1727204464.97676: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41684 1727204464.97696: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41684 1727204464.97742: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 41684 1727204464.97759: stderr chunk (state=3): >>>debug2: match not found <<< 41684 1727204464.97778: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204464.97832: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 41684 1727204464.97846: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 41684 1727204464.97863: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 41684 
1727204464.97881: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41684 1727204464.97895: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41684 1727204464.97911: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41684 1727204464.97925: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 41684 1727204464.97937: stderr chunk (state=3): >>>debug2: match found <<< 41684 1727204464.97951: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204464.98128: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 41684 1727204464.98146: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41684 1727204464.98181: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41684 1727204464.98306: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41684 1727204465.00094: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41684 1727204465.00097: stdout chunk (state=3): >>><<< 41684 1727204465.00099: stderr chunk (state=3): >>><<< 41684 1727204465.00196: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41684 1727204465.00204: _low_level_execute_command(): starting 41684 1727204465.00206: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204464.6319501-43431-131528484259047/AnsiballZ_network_connections.py && sleep 0' 41684 1727204465.01530: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 41684 1727204465.01533: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41684 1727204465.01570: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found <<< 41684 1727204465.01573: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 41684 1727204465.01576: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found <<< 41684 1727204465.01578: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204465.01633: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 41684 1727204465.01890: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41684 1727204465.01893: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41684 1727204465.01972: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41684 1727204465.33108: stdout chunk (state=3): >>> {"changed": true, "warnings": [], "stderr": "[005] #0, state:up persistent_state:present, 'ethtest0': add connection ethtest0, 368423eb-f869-403f-a6af-2344dcd8e0b3\n[006] #1, state:up persistent_state:present, 'ethtest1': add connection ethtest1, 7d6131f1-a08f-4727-b007-3042c5fbcd66\n[007] #0, state:up persistent_state:present, 'ethtest0': up connection ethtest0, 368423eb-f869-403f-a6af-2344dcd8e0b3 (not-active)\n[008] #1, state:up persistent_state:present, 'ethtest1': up connection ethtest1, 7d6131f1-a08f-4727-b007-3042c5fbcd66 (not-active)\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "ethtest0", "interface_name": "ethtest0", "state": "up", "type": "ethernet", "autoconnect": false, "ip": {"address": ["198.51.100.3/24", "2001:db8::2/32"], "route": [{"network": "198.51.10.64", "prefix": 26, "gateway": "198.51.100.6", "metric": 4}, {"network": "2001:db6::4", "prefix": 128, "gateway": "2001:db8::1", "metric": 2}]}}, {"name": "ethtest1", "interface_name": "ethtest1", "state": "up", "type": "ethernet", "autoconnect": false, "ip": {"address": ["198.51.100.6/24", "2001:db8::4/32"], "route": [{"network": "198.51.12.128", "prefix": 26, "gateway": "198.51.100.1", "metric": 2}]}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "ethtest0", "interface_name": "ethtest0", "state": "up", 
"type": "ethernet", "autoconnect": false, "ip": {"address": ["198.51.100.3/24", "2001:db8::2/32"], "route": [{"network": "198.51.10.64", "prefix": 26, "gateway": "198.51.100.6", "metric": 4}, {"network": "2001:db6::4", "prefix": 128, "gateway": "2001:db8::1", "metric": 2}]}}, {"name": "ethtest1", "interface_name": "ethtest1", "state": "up", "type": "ethernet", "autoconnect": false, "ip": {"address": ["198.51.100.6/24", "2001:db8::4/32"], "route": [{"network": "198.51.12.128", "prefix": 26, "gateway": "198.51.100.1", "metric": 2}]}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} <<< 41684 1727204465.35367: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.148 closed. <<< 41684 1727204465.35371: stdout chunk (state=3): >>><<< 41684 1727204465.35381: stderr chunk (state=3): >>><<< 41684 1727204465.35405: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "warnings": [], "stderr": "[005] #0, state:up persistent_state:present, 'ethtest0': add connection ethtest0, 368423eb-f869-403f-a6af-2344dcd8e0b3\n[006] #1, state:up persistent_state:present, 'ethtest1': add connection ethtest1, 7d6131f1-a08f-4727-b007-3042c5fbcd66\n[007] #0, state:up persistent_state:present, 'ethtest0': up connection ethtest0, 368423eb-f869-403f-a6af-2344dcd8e0b3 (not-active)\n[008] #1, state:up persistent_state:present, 'ethtest1': up connection ethtest1, 7d6131f1-a08f-4727-b007-3042c5fbcd66 (not-active)\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "ethtest0", "interface_name": "ethtest0", "state": "up", "type": "ethernet", "autoconnect": false, "ip": {"address": ["198.51.100.3/24", "2001:db8::2/32"], "route": [{"network": "198.51.10.64", "prefix": 26, "gateway": "198.51.100.6", "metric": 4}, {"network": "2001:db6::4", "prefix": 128, "gateway": "2001:db8::1", "metric": 2}]}}, {"name": "ethtest1", 
"interface_name": "ethtest1", "state": "up", "type": "ethernet", "autoconnect": false, "ip": {"address": ["198.51.100.6/24", "2001:db8::4/32"], "route": [{"network": "198.51.12.128", "prefix": 26, "gateway": "198.51.100.1", "metric": 2}]}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "ethtest0", "interface_name": "ethtest0", "state": "up", "type": "ethernet", "autoconnect": false, "ip": {"address": ["198.51.100.3/24", "2001:db8::2/32"], "route": [{"network": "198.51.10.64", "prefix": 26, "gateway": "198.51.100.6", "metric": 4}, {"network": "2001:db6::4", "prefix": 128, "gateway": "2001:db8::1", "metric": 2}]}}, {"name": "ethtest1", "interface_name": "ethtest1", "state": "up", "type": "ethernet", "autoconnect": false, "ip": {"address": ["198.51.100.6/24", "2001:db8::4/32"], "route": [{"network": "198.51.12.128", "prefix": 26, "gateway": "198.51.100.1", "metric": 2}]}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 
10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.148 closed. 41684 1727204465.35490: done with _execute_module (fedora.linux_system_roles.network_connections, {'provider': 'nm', 'connections': [{'name': 'ethtest0', 'interface_name': 'ethtest0', 'state': 'up', 'type': 'ethernet', 'autoconnect': False, 'ip': {'address': ['198.51.100.3/24', '2001:db8::2/32'], 'route': [{'network': '198.51.10.64', 'prefix': 26, 'gateway': '198.51.100.6', 'metric': 4}, {'network': '2001:db6::4', 'prefix': 128, 'gateway': '2001:db8::1', 'metric': 2}]}}, {'name': 'ethtest1', 'interface_name': 'ethtest1', 'state': 'up', 'type': 'ethernet', 'autoconnect': False, 'ip': {'address': ['198.51.100.6/24', '2001:db8::4/32'], 'route': [{'network': '198.51.12.128', 'prefix': 26, 'gateway': '198.51.100.1', 'metric': 2}]}}], '__header': '#\n# Ansible managed\n#\n# system_role:network\n', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'fedora.linux_system_roles.network_connections', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204464.6319501-43431-131528484259047/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 41684 1727204465.35499: _low_level_execute_command(): starting 41684 1727204465.35504: 
_low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204464.6319501-43431-131528484259047/ > /dev/null 2>&1 && sleep 0' 41684 1727204465.36230: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 41684 1727204465.36246: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41684 1727204465.36257: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41684 1727204465.36278: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41684 1727204465.36319: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 41684 1727204465.36326: stderr chunk (state=3): >>>debug2: match not found <<< 41684 1727204465.36337: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204465.36357: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 41684 1727204465.36368: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 41684 1727204465.36378: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 41684 1727204465.36386: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41684 1727204465.36395: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41684 1727204465.36406: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41684 1727204465.36413: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 41684 1727204465.36419: stderr chunk (state=3): >>>debug2: match found <<< 41684 1727204465.36428: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204465.36517: stderr chunk (state=3): 
>>>debug1: auto-mux: Trying existing master <<< 41684 1727204465.36536: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41684 1727204465.36549: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41684 1727204465.36646: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41684 1727204465.38487: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41684 1727204465.38553: stderr chunk (state=3): >>><<< 41684 1727204465.38557: stdout chunk (state=3): >>><<< 41684 1727204465.38581: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41684 1727204465.38587: handler run complete 41684 1727204465.38644: attempt loop complete, returning result 41684 1727204465.38647: _execute() done 41684 1727204465.38651: dumping result to json 41684 1727204465.38660: done 
dumping result, returning 41684 1727204465.38671: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Configure networking connection profiles [0affcd87-79f5-3839-086d-00000000002b] 41684 1727204465.38676: sending task result for task 0affcd87-79f5-3839-086d-00000000002b 41684 1727204465.38805: done sending task result for task 0affcd87-79f5-3839-086d-00000000002b 41684 1727204465.38808: WORKER PROCESS EXITING changed: [managed-node1] => { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "autoconnect": false, "interface_name": "ethtest0", "ip": { "address": [ "198.51.100.3/24", "2001:db8::2/32" ], "route": [ { "gateway": "198.51.100.6", "metric": 4, "network": "198.51.10.64", "prefix": 26 }, { "gateway": "2001:db8::1", "metric": 2, "network": "2001:db6::4", "prefix": 128 } ] }, "name": "ethtest0", "state": "up", "type": "ethernet" }, { "autoconnect": false, "interface_name": "ethtest1", "ip": { "address": [ "198.51.100.6/24", "2001:db8::4/32" ], "route": [ { "gateway": "198.51.100.1", "metric": 2, "network": "198.51.12.128", "prefix": 26 } ] }, "name": "ethtest1", "state": "up", "type": "ethernet" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true } STDERR: [005] #0, state:up persistent_state:present, 'ethtest0': add connection ethtest0, 368423eb-f869-403f-a6af-2344dcd8e0b3 [006] #1, state:up persistent_state:present, 'ethtest1': add connection ethtest1, 7d6131f1-a08f-4727-b007-3042c5fbcd66 [007] #0, state:up persistent_state:present, 'ethtest0': up connection ethtest0, 368423eb-f869-403f-a6af-2344dcd8e0b3 (not-active) [008] #1, state:up persistent_state:present, 'ethtest1': up connection ethtest1, 7d6131f1-a08f-4727-b007-3042c5fbcd66 (not-active) 41684 1727204465.38981: no more pending results, returning what we have 41684 1727204465.38985: results queue empty 41684 1727204465.38986: checking for 
any_errors_fatal 41684 1727204465.38992: done checking for any_errors_fatal 41684 1727204465.38993: checking for max_fail_percentage 41684 1727204465.38994: done checking for max_fail_percentage 41684 1727204465.38995: checking to see if all hosts have failed and the running result is not ok 41684 1727204465.38995: done checking to see if all hosts have failed 41684 1727204465.38996: getting the remaining hosts for this loop 41684 1727204465.38998: done getting the remaining hosts for this loop 41684 1727204465.39002: getting the next task for host managed-node1 41684 1727204465.39007: done getting next task for host managed-node1 41684 1727204465.39011: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking state 41684 1727204465.39018: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 41684 1727204465.39028: getting variables 41684 1727204465.39029: in VariableManager get_vars() 41684 1727204465.39070: Calling all_inventory to load vars for managed-node1 41684 1727204465.39073: Calling groups_inventory to load vars for managed-node1 41684 1727204465.39074: Calling all_plugins_inventory to load vars for managed-node1 41684 1727204465.39083: Calling all_plugins_play to load vars for managed-node1 41684 1727204465.39086: Calling groups_plugins_inventory to load vars for managed-node1 41684 1727204465.39088: Calling groups_plugins_play to load vars for managed-node1 41684 1727204465.40609: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41684 1727204465.42853: done with get_vars() 41684 1727204465.42916: done getting variables TASK [fedora.linux_system_roles.network : Configure networking state] ********** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171 Tuesday 24 September 2024 15:01:05 -0400 (0:00:00.923) 0:00:21.831 ***** 41684 1727204465.43015: entering _queue_task() for managed-node1/fedora.linux_system_roles.network_state 41684 1727204465.43022: Creating lock for fedora.linux_system_roles.network_state 41684 1727204465.43443: worker is 1 (out of 1 available) 41684 1727204465.43456: exiting _queue_task() for managed-node1/fedora.linux_system_roles.network_state 41684 1727204465.43474: done queuing things up, now waiting for results queue to drain 41684 1727204465.43480: waiting for pending results... 
41684 1727204465.43778: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Configure networking state 41684 1727204465.43927: in run() - task 0affcd87-79f5-3839-086d-00000000002c 41684 1727204465.43949: variable 'ansible_search_path' from source: unknown 41684 1727204465.43957: variable 'ansible_search_path' from source: unknown 41684 1727204465.43999: calling self._execute() 41684 1727204465.44103: variable 'ansible_host' from source: host vars for 'managed-node1' 41684 1727204465.44115: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41684 1727204465.44139: variable 'omit' from source: magic vars 41684 1727204465.44542: variable 'ansible_distribution_major_version' from source: facts 41684 1727204465.44570: Evaluated conditional (ansible_distribution_major_version != '6'): True 41684 1727204465.44714: variable 'network_state' from source: role '' defaults 41684 1727204465.44730: Evaluated conditional (network_state != {}): False 41684 1727204465.44738: when evaluation is False, skipping this task 41684 1727204465.44746: _execute() done 41684 1727204465.44754: dumping result to json 41684 1727204465.44761: done dumping result, returning 41684 1727204465.44778: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Configure networking state [0affcd87-79f5-3839-086d-00000000002c] 41684 1727204465.44798: sending task result for task 0affcd87-79f5-3839-086d-00000000002c skipping: [managed-node1] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 41684 1727204465.44986: no more pending results, returning what we have 41684 1727204465.44991: results queue empty 41684 1727204465.44992: checking for any_errors_fatal 41684 1727204465.45007: done checking for any_errors_fatal 41684 1727204465.45008: checking for max_fail_percentage 41684 1727204465.45010: done checking for max_fail_percentage 41684 1727204465.45011: 
checking to see if all hosts have failed and the running result is not ok 41684 1727204465.45012: done checking to see if all hosts have failed 41684 1727204465.45012: getting the remaining hosts for this loop 41684 1727204465.45014: done getting the remaining hosts for this loop 41684 1727204465.45019: getting the next task for host managed-node1 41684 1727204465.45027: done getting next task for host managed-node1 41684 1727204465.45032: ^ task is: TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 41684 1727204465.45037: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 41684 1727204465.45053: getting variables 41684 1727204465.45055: in VariableManager get_vars() 41684 1727204465.45102: Calling all_inventory to load vars for managed-node1 41684 1727204465.45105: Calling groups_inventory to load vars for managed-node1 41684 1727204465.45108: Calling all_plugins_inventory to load vars for managed-node1 41684 1727204465.45119: Calling all_plugins_play to load vars for managed-node1 41684 1727204465.45121: Calling groups_plugins_inventory to load vars for managed-node1 41684 1727204465.45123: Calling groups_plugins_play to load vars for managed-node1 41684 1727204465.46841: done sending task result for task 0affcd87-79f5-3839-086d-00000000002c 41684 1727204465.46845: WORKER PROCESS EXITING 41684 1727204465.48941: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41684 1727204465.52550: done with get_vars() 41684 1727204465.52587: done getting variables 41684 1727204465.52765: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show stderr messages for the network_connections] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177 Tuesday 24 September 2024 15:01:05 -0400 (0:00:00.097) 0:00:21.929 ***** 41684 1727204465.52801: entering _queue_task() for managed-node1/debug 41684 1727204465.53470: worker is 1 (out of 1 available) 41684 1727204465.53574: exiting _queue_task() for managed-node1/debug 41684 1727204465.53590: done queuing things up, now waiting for results queue to drain 41684 1727204465.53592: waiting for pending results... 
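The `__network_connections_result.stderr_lines` entries shown in this log follow a regular shape (`[seq] #index, state:…, 'name': action`). A sketch of how such a line could be picked apart; the regex and field names are our own convention, not something the role documents:

```python
import re

# One entry from __network_connections_result.stderr_lines, copied from the log.
LINE = ("[007] #0, state:up persistent_state:present, 'ethtest0': "
        "up connection ethtest0, 368423eb-f869-403f-a6af-2344dcd8e0b3 (not-active)")

# Illustrative pattern for this stderr format; group names are assumptions.
PATTERN = re.compile(
    r"\[(?P<seq>\d+)\] #(?P<idx>\d+), state:(?P<state>\S+) "
    r"persistent_state:(?P<persistent>\S+), '(?P<name>[^']+)': (?P<action>.*)"
)

m = PATTERN.match(LINE)
print(m.group("name"), m.group("state"))  # ethtest0 up
```

The `(not-active)` suffix on the `up` entries simply records that the profile was not yet active when the module brought it up, consistent with `autoconnect: false` in the invocation.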
41684 1727204465.54369: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 41684 1727204465.54712: in run() - task 0affcd87-79f5-3839-086d-00000000002d 41684 1727204465.54729: variable 'ansible_search_path' from source: unknown 41684 1727204465.54733: variable 'ansible_search_path' from source: unknown 41684 1727204465.54776: calling self._execute() 41684 1727204465.54869: variable 'ansible_host' from source: host vars for 'managed-node1' 41684 1727204465.54878: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41684 1727204465.54888: variable 'omit' from source: magic vars 41684 1727204465.55656: variable 'ansible_distribution_major_version' from source: facts 41684 1727204465.55978: Evaluated conditional (ansible_distribution_major_version != '6'): True 41684 1727204465.55982: variable 'omit' from source: magic vars 41684 1727204465.56044: variable 'omit' from source: magic vars 41684 1727204465.56084: variable 'omit' from source: magic vars 41684 1727204465.56124: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 41684 1727204465.56159: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 41684 1727204465.56328: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 41684 1727204465.56346: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41684 1727204465.56356: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41684 1727204465.56392: variable 'inventory_hostname' from source: host vars for 'managed-node1' 41684 1727204465.56395: variable 'ansible_host' from source: host vars for 'managed-node1' 41684 1727204465.56398: variable 'ansible_ssh_extra_args' from 
source: host vars for 'managed-node1' 41684 1727204465.56494: Set connection var ansible_connection to ssh 41684 1727204465.56500: Set connection var ansible_pipelining to False 41684 1727204465.56505: Set connection var ansible_module_compression to ZIP_DEFLATED 41684 1727204465.56511: Set connection var ansible_timeout to 10 41684 1727204465.56519: Set connection var ansible_shell_executable to /bin/sh 41684 1727204465.56521: Set connection var ansible_shell_type to sh 41684 1727204465.56547: variable 'ansible_shell_executable' from source: unknown 41684 1727204465.56550: variable 'ansible_connection' from source: unknown 41684 1727204465.56552: variable 'ansible_module_compression' from source: unknown 41684 1727204465.56555: variable 'ansible_shell_type' from source: unknown 41684 1727204465.56557: variable 'ansible_shell_executable' from source: unknown 41684 1727204465.56559: variable 'ansible_host' from source: host vars for 'managed-node1' 41684 1727204465.56561: variable 'ansible_pipelining' from source: unknown 41684 1727204465.56567: variable 'ansible_timeout' from source: unknown 41684 1727204465.56572: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41684 1727204465.57162: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 41684 1727204465.57177: variable 'omit' from source: magic vars 41684 1727204465.57180: starting attempt loop 41684 1727204465.57183: running the handler 41684 1727204465.57315: variable '__network_connections_result' from source: set_fact 41684 1727204465.57375: handler run complete 41684 1727204465.57391: attempt loop complete, returning result 41684 1727204465.57394: _execute() done 41684 1727204465.57397: dumping result to json 41684 1727204465.57399: 
done dumping result, returning 41684 1727204465.57407: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections [0affcd87-79f5-3839-086d-00000000002d] 41684 1727204465.57413: sending task result for task 0affcd87-79f5-3839-086d-00000000002d 41684 1727204465.57503: done sending task result for task 0affcd87-79f5-3839-086d-00000000002d 41684 1727204465.57506: WORKER PROCESS EXITING ok: [managed-node1] => { "__network_connections_result.stderr_lines": [ "[005] #0, state:up persistent_state:present, 'ethtest0': add connection ethtest0, 368423eb-f869-403f-a6af-2344dcd8e0b3", "[006] #1, state:up persistent_state:present, 'ethtest1': add connection ethtest1, 7d6131f1-a08f-4727-b007-3042c5fbcd66", "[007] #0, state:up persistent_state:present, 'ethtest0': up connection ethtest0, 368423eb-f869-403f-a6af-2344dcd8e0b3 (not-active)", "[008] #1, state:up persistent_state:present, 'ethtest1': up connection ethtest1, 7d6131f1-a08f-4727-b007-3042c5fbcd66 (not-active)" ] } 41684 1727204465.57599: no more pending results, returning what we have 41684 1727204465.57603: results queue empty 41684 1727204465.57604: checking for any_errors_fatal 41684 1727204465.57610: done checking for any_errors_fatal 41684 1727204465.57611: checking for max_fail_percentage 41684 1727204465.57613: done checking for max_fail_percentage 41684 1727204465.57613: checking to see if all hosts have failed and the running result is not ok 41684 1727204465.57616: done checking to see if all hosts have failed 41684 1727204465.57616: getting the remaining hosts for this loop 41684 1727204465.57618: done getting the remaining hosts for this loop 41684 1727204465.57623: getting the next task for host managed-node1 41684 1727204465.57629: done getting next task for host managed-node1 41684 1727204465.57633: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 41684 1727204465.57636: ^ state 
is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=23, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 41684 1727204465.57647: getting variables 41684 1727204465.57649: in VariableManager get_vars() 41684 1727204465.57694: Calling all_inventory to load vars for managed-node1 41684 1727204465.57697: Calling groups_inventory to load vars for managed-node1 41684 1727204465.57699: Calling all_plugins_inventory to load vars for managed-node1 41684 1727204465.57707: Calling all_plugins_play to load vars for managed-node1 41684 1727204465.57709: Calling groups_plugins_inventory to load vars for managed-node1 41684 1727204465.57711: Calling groups_plugins_play to load vars for managed-node1 41684 1727204465.59825: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41684 1727204465.61530: done with get_vars() 41684 1727204465.61560: done getting variables 41684 1727204465.61628: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_connections] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181 Tuesday 24 September 2024 15:01:05 -0400 (0:00:00.088) 0:00:22.018 ***** 41684 
1727204465.61674: entering _queue_task() for managed-node1/debug 41684 1727204465.62137: worker is 1 (out of 1 available) 41684 1727204465.62150: exiting _queue_task() for managed-node1/debug 41684 1727204465.62166: done queuing things up, now waiting for results queue to drain 41684 1727204465.62167: waiting for pending results... 41684 1727204465.62785: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 41684 1727204465.62791: in run() - task 0affcd87-79f5-3839-086d-00000000002e 41684 1727204465.62794: variable 'ansible_search_path' from source: unknown 41684 1727204465.62797: variable 'ansible_search_path' from source: unknown 41684 1727204465.62800: calling self._execute() 41684 1727204465.62803: variable 'ansible_host' from source: host vars for 'managed-node1' 41684 1727204465.62805: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41684 1727204465.62809: variable 'omit' from source: magic vars 41684 1727204465.63534: variable 'ansible_distribution_major_version' from source: facts 41684 1727204465.63538: Evaluated conditional (ansible_distribution_major_version != '6'): True 41684 1727204465.63541: variable 'omit' from source: magic vars 41684 1727204465.63543: variable 'omit' from source: magic vars 41684 1727204465.63545: variable 'omit' from source: magic vars 41684 1727204465.63547: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 41684 1727204465.63551: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 41684 1727204465.63553: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 41684 1727204465.63555: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41684 1727204465.63558: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41684 1727204465.63561: variable 'inventory_hostname' from source: host vars for 'managed-node1' 41684 1727204465.63563: variable 'ansible_host' from source: host vars for 'managed-node1' 41684 1727204465.63567: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41684 1727204465.63594: Set connection var ansible_connection to ssh 41684 1727204465.63601: Set connection var ansible_pipelining to False 41684 1727204465.63607: Set connection var ansible_module_compression to ZIP_DEFLATED 41684 1727204465.63613: Set connection var ansible_timeout to 10 41684 1727204465.63621: Set connection var ansible_shell_executable to /bin/sh 41684 1727204465.63624: Set connection var ansible_shell_type to sh 41684 1727204465.63649: variable 'ansible_shell_executable' from source: unknown 41684 1727204465.63652: variable 'ansible_connection' from source: unknown 41684 1727204465.63654: variable 'ansible_module_compression' from source: unknown 41684 1727204465.63657: variable 'ansible_shell_type' from source: unknown 41684 1727204465.63659: variable 'ansible_shell_executable' from source: unknown 41684 1727204465.63661: variable 'ansible_host' from source: host vars for 'managed-node1' 41684 1727204465.63669: variable 'ansible_pipelining' from source: unknown 41684 1727204465.63672: variable 'ansible_timeout' from source: unknown 41684 1727204465.63676: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41684 1727204465.63810: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 41684 1727204465.63820: variable 'omit' from source: magic vars 41684 1727204465.63825: starting attempt 
loop 41684 1727204465.63828: running the handler 41684 1727204465.63889: variable '__network_connections_result' from source: set_fact 41684 1727204465.64304: variable '__network_connections_result' from source: set_fact 41684 1727204465.64522: handler run complete 41684 1727204465.64560: attempt loop complete, returning result 41684 1727204465.64565: _execute() done 41684 1727204465.64571: dumping result to json 41684 1727204465.64579: done dumping result, returning 41684 1727204465.64587: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections [0affcd87-79f5-3839-086d-00000000002e] 41684 1727204465.64592: sending task result for task 0affcd87-79f5-3839-086d-00000000002e ok: [managed-node1] => { "__network_connections_result": { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "autoconnect": false, "interface_name": "ethtest0", "ip": { "address": [ "198.51.100.3/24", "2001:db8::2/32" ], "route": [ { "gateway": "198.51.100.6", "metric": 4, "network": "198.51.10.64", "prefix": 26 }, { "gateway": "2001:db8::1", "metric": 2, "network": "2001:db6::4", "prefix": 128 } ] }, "name": "ethtest0", "state": "up", "type": "ethernet" }, { "autoconnect": false, "interface_name": "ethtest1", "ip": { "address": [ "198.51.100.6/24", "2001:db8::4/32" ], "route": [ { "gateway": "198.51.100.1", "metric": 2, "network": "198.51.12.128", "prefix": 26 } ] }, "name": "ethtest1", "state": "up", "type": "ethernet" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true, "failed": false, "stderr": "[005] #0, state:up persistent_state:present, 'ethtest0': add connection ethtest0, 368423eb-f869-403f-a6af-2344dcd8e0b3\n[006] #1, state:up persistent_state:present, 'ethtest1': add connection ethtest1, 7d6131f1-a08f-4727-b007-3042c5fbcd66\n[007] #0, state:up persistent_state:present, 
'ethtest0': up connection ethtest0, 368423eb-f869-403f-a6af-2344dcd8e0b3 (not-active)\n[008] #1, state:up persistent_state:present, 'ethtest1': up connection ethtest1, 7d6131f1-a08f-4727-b007-3042c5fbcd66 (not-active)\n", "stderr_lines": [ "[005] #0, state:up persistent_state:present, 'ethtest0': add connection ethtest0, 368423eb-f869-403f-a6af-2344dcd8e0b3", "[006] #1, state:up persistent_state:present, 'ethtest1': add connection ethtest1, 7d6131f1-a08f-4727-b007-3042c5fbcd66", "[007] #0, state:up persistent_state:present, 'ethtest0': up connection ethtest0, 368423eb-f869-403f-a6af-2344dcd8e0b3 (not-active)", "[008] #1, state:up persistent_state:present, 'ethtest1': up connection ethtest1, 7d6131f1-a08f-4727-b007-3042c5fbcd66 (not-active)" ] } } 41684 1727204465.64815: done sending task result for task 0affcd87-79f5-3839-086d-00000000002e 41684 1727204465.64870: WORKER PROCESS EXITING 41684 1727204465.64880: no more pending results, returning what we have 41684 1727204465.64884: results queue empty 41684 1727204465.64885: checking for any_errors_fatal 41684 1727204465.64894: done checking for any_errors_fatal 41684 1727204465.64895: checking for max_fail_percentage 41684 1727204465.64897: done checking for max_fail_percentage 41684 1727204465.64897: checking to see if all hosts have failed and the running result is not ok 41684 1727204465.64898: done checking to see if all hosts have failed 41684 1727204465.64899: getting the remaining hosts for this loop 41684 1727204465.64901: done getting the remaining hosts for this loop 41684 1727204465.64905: getting the next task for host managed-node1 41684 1727204465.64912: done getting next task for host managed-node1 41684 1727204465.64916: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 41684 1727204465.64919: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, 
pending_setup=False, tasks child state? (HOST STATE: block=0, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 41684 1727204465.64931: getting variables 41684 1727204465.64933: in VariableManager get_vars() 41684 1727204465.64982: Calling all_inventory to load vars for managed-node1 41684 1727204465.64985: Calling groups_inventory to load vars for managed-node1 41684 1727204465.64987: Calling all_plugins_inventory to load vars for managed-node1 41684 1727204465.65001: Calling all_plugins_play to load vars for managed-node1 41684 1727204465.65004: Calling groups_plugins_inventory to load vars for managed-node1 41684 1727204465.65007: Calling groups_plugins_play to load vars for managed-node1 41684 1727204465.67346: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41684 1727204465.71056: done with get_vars() 41684 1727204465.71098: done getting variables 41684 1727204465.71166: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_state] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186 Tuesday 24 September 2024 15:01:05 -0400 (0:00:00.095) 0:00:22.113 ***** 41684 1727204465.71200: entering _queue_task() for managed-node1/debug 41684 1727204465.71537: worker is 1 (out of 1 available) 41684 1727204465.71551: 
exiting _queue_task() for managed-node1/debug 41684 1727204465.71568: done queuing things up, now waiting for results queue to drain 41684 1727204465.71569: waiting for pending results... 41684 1727204465.71873: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 41684 1727204465.72026: in run() - task 0affcd87-79f5-3839-086d-00000000002f 41684 1727204465.72049: variable 'ansible_search_path' from source: unknown 41684 1727204465.72057: variable 'ansible_search_path' from source: unknown 41684 1727204465.72104: calling self._execute() 41684 1727204465.72210: variable 'ansible_host' from source: host vars for 'managed-node1' 41684 1727204465.72221: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41684 1727204465.72240: variable 'omit' from source: magic vars 41684 1727204465.72627: variable 'ansible_distribution_major_version' from source: facts 41684 1727204465.72646: Evaluated conditional (ansible_distribution_major_version != '6'): True 41684 1727204465.72785: variable 'network_state' from source: role '' defaults 41684 1727204465.72802: Evaluated conditional (network_state != {}): False 41684 1727204465.72810: when evaluation is False, skipping this task 41684 1727204465.72818: _execute() done 41684 1727204465.72826: dumping result to json 41684 1727204465.72832: done dumping result, returning 41684 1727204465.72845: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state [0affcd87-79f5-3839-086d-00000000002f] 41684 1727204465.72857: sending task result for task 0affcd87-79f5-3839-086d-00000000002f 41684 1727204465.72974: done sending task result for task 0affcd87-79f5-3839-086d-00000000002f 41684 1727204465.72980: WORKER PROCESS EXITING skipping: [managed-node1] => { "false_condition": "network_state != {}" } 41684 1727204465.73028: no more pending results, returning what we have 41684 
1727204465.73032: results queue empty 41684 1727204465.73033: checking for any_errors_fatal 41684 1727204465.73046: done checking for any_errors_fatal 41684 1727204465.73046: checking for max_fail_percentage 41684 1727204465.73048: done checking for max_fail_percentage 41684 1727204465.73049: checking to see if all hosts have failed and the running result is not ok 41684 1727204465.73050: done checking to see if all hosts have failed 41684 1727204465.73051: getting the remaining hosts for this loop 41684 1727204465.73053: done getting the remaining hosts for this loop 41684 1727204465.73057: getting the next task for host managed-node1 41684 1727204465.73068: done getting next task for host managed-node1 41684 1727204465.73072: ^ task is: TASK: fedora.linux_system_roles.network : Re-test connectivity 41684 1727204465.73075: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 41684 1727204465.73090: getting variables 41684 1727204465.73092: in VariableManager get_vars() 41684 1727204465.73134: Calling all_inventory to load vars for managed-node1 41684 1727204465.73137: Calling groups_inventory to load vars for managed-node1 41684 1727204465.73139: Calling all_plugins_inventory to load vars for managed-node1 41684 1727204465.73152: Calling all_plugins_play to load vars for managed-node1 41684 1727204465.73154: Calling groups_plugins_inventory to load vars for managed-node1 41684 1727204465.73157: Calling groups_plugins_play to load vars for managed-node1 41684 1727204465.74882: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41684 1727204465.76660: done with get_vars() 41684 1727204465.76687: done getting variables TASK [fedora.linux_system_roles.network : Re-test connectivity] **************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192 Tuesday 24 September 2024 15:01:05 -0400 (0:00:00.055) 0:00:22.169 ***** 41684 1727204465.76789: entering _queue_task() for managed-node1/ping 41684 1727204465.76791: Creating lock for ping 41684 1727204465.77138: worker is 1 (out of 1 available) 41684 1727204465.77151: exiting _queue_task() for managed-node1/ping 41684 1727204465.77168: done queuing things up, now waiting for results queue to drain 41684 1727204465.77170: waiting for pending results... 
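The numbered `stderr_lines` shown in the two debug-task results above (`[005] #0, state:up persistent_state:present, 'ethtest0': add connection ethtest0, <uuid>` and so on) follow a regular shape. A minimal parsing sketch, assuming the field meanings inferred from this log (sequence number, connection index, states, connection name, action, UUID, optional trailing note) rather than any documented format of the `fedora.linux_system_roles.network` role:

```python
import re

# Hypothetical helper, not part of the role: fields are inferred from the
# stderr lines visible in this log, e.g.
#   [005] #0, state:up persistent_state:present, 'ethtest0': add connection ethtest0, <uuid>
#   [007] #0, state:up persistent_state:present, 'ethtest0': up connection ethtest0, <uuid> (not-active)
LINE_RE = re.compile(
    r"\[(?P<seq>\d+)\] #(?P<idx>\d+), "
    r"state:(?P<state>\S+) persistent_state:(?P<persistent>\S+), "
    r"'(?P<name>[^']+)': (?P<action>add|up) connection (?P<conn>\S+), "
    r"(?P<uuid>[0-9a-f-]{36})(?: \((?P<note>[^)]+)\))?"
)

def parse_status_line(line: str) -> dict:
    """Split one role status line into its named fields."""
    m = LINE_RE.match(line)
    if m is None:
        raise ValueError(f"unrecognized status line: {line!r}")
    return m.groupdict()
```

For the lines in this run, `note` is `None` for the `add` entries and `"not-active"` for the `up` entries, which matches the `(not-active)` suffix visible in the output above.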
41684 1727204465.77450: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Re-test connectivity 41684 1727204465.77591: in run() - task 0affcd87-79f5-3839-086d-000000000030 41684 1727204465.77615: variable 'ansible_search_path' from source: unknown 41684 1727204465.77622: variable 'ansible_search_path' from source: unknown 41684 1727204465.77666: calling self._execute() 41684 1727204465.77758: variable 'ansible_host' from source: host vars for 'managed-node1' 41684 1727204465.77774: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41684 1727204465.77787: variable 'omit' from source: magic vars 41684 1727204465.78169: variable 'ansible_distribution_major_version' from source: facts 41684 1727204465.78187: Evaluated conditional (ansible_distribution_major_version != '6'): True 41684 1727204465.78200: variable 'omit' from source: magic vars 41684 1727204465.78260: variable 'omit' from source: magic vars 41684 1727204465.78305: variable 'omit' from source: magic vars 41684 1727204465.78347: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 41684 1727204465.78394: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 41684 1727204465.78419: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 41684 1727204465.78439: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41684 1727204465.78454: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41684 1727204465.78494: variable 'inventory_hostname' from source: host vars for 'managed-node1' 41684 1727204465.78501: variable 'ansible_host' from source: host vars for 'managed-node1' 41684 1727204465.78509: variable 'ansible_ssh_extra_args' from source: host vars for 
'managed-node1' 41684 1727204465.78617: Set connection var ansible_connection to ssh 41684 1727204465.78628: Set connection var ansible_pipelining to False 41684 1727204465.78637: Set connection var ansible_module_compression to ZIP_DEFLATED 41684 1727204465.78646: Set connection var ansible_timeout to 10 41684 1727204465.78656: Set connection var ansible_shell_executable to /bin/sh 41684 1727204465.78666: Set connection var ansible_shell_type to sh 41684 1727204465.78693: variable 'ansible_shell_executable' from source: unknown 41684 1727204465.78701: variable 'ansible_connection' from source: unknown 41684 1727204465.78711: variable 'ansible_module_compression' from source: unknown 41684 1727204465.78717: variable 'ansible_shell_type' from source: unknown 41684 1727204465.78723: variable 'ansible_shell_executable' from source: unknown 41684 1727204465.78729: variable 'ansible_host' from source: host vars for 'managed-node1' 41684 1727204465.78735: variable 'ansible_pipelining' from source: unknown 41684 1727204465.78741: variable 'ansible_timeout' from source: unknown 41684 1727204465.78747: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41684 1727204465.78961: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 41684 1727204465.78981: variable 'omit' from source: magic vars 41684 1727204465.78991: starting attempt loop 41684 1727204465.78997: running the handler 41684 1727204465.79014: _low_level_execute_command(): starting 41684 1727204465.79029: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 41684 1727204465.79820: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 41684 1727204465.79835: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41684 
1727204465.79850: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41684 1727204465.79878: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41684 1727204465.79928: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 41684 1727204465.79940: stderr chunk (state=3): >>>debug2: match not found <<< 41684 1727204465.79954: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204465.79979: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 41684 1727204465.79992: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 41684 1727204465.80003: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 41684 1727204465.80019: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41684 1727204465.80035: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41684 1727204465.80052: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41684 1727204465.80071: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 41684 1727204465.80084: stderr chunk (state=3): >>>debug2: match found <<< 41684 1727204465.80099: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204465.80184: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 41684 1727204465.80202: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41684 1727204465.80216: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41684 1727204465.80310: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41684 
1727204465.81878: stdout chunk (state=3): >>>/root <<< 41684 1727204465.81981: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41684 1727204465.82063: stderr chunk (state=3): >>><<< 41684 1727204465.82069: stdout chunk (state=3): >>><<< 41684 1727204465.82177: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41684 1727204465.82181: _low_level_execute_command(): starting 41684 1727204465.82184: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204465.8208823-43483-244141888494479 `" && echo ansible-tmp-1727204465.8208823-43483-244141888494479="` echo /root/.ansible/tmp/ansible-tmp-1727204465.8208823-43483-244141888494479 `" ) && sleep 0' 41684 1727204465.82754: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 
3.2.2 4 Jun 2024 <<< 41684 1727204465.82771: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41684 1727204465.82787: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41684 1727204465.82805: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41684 1727204465.82849: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 41684 1727204465.82862: stderr chunk (state=3): >>>debug2: match not found <<< 41684 1727204465.82880: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204465.82898: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 41684 1727204465.82910: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 41684 1727204465.82920: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 41684 1727204465.82933: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41684 1727204465.82946: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41684 1727204465.82962: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41684 1727204465.82978: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 41684 1727204465.82991: stderr chunk (state=3): >>>debug2: match found <<< 41684 1727204465.83005: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204465.83087: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 41684 1727204465.83105: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41684 1727204465.83119: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 
<<< 41684 1727204465.83216: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41684 1727204465.85070: stdout chunk (state=3): >>>ansible-tmp-1727204465.8208823-43483-244141888494479=/root/.ansible/tmp/ansible-tmp-1727204465.8208823-43483-244141888494479 <<< 41684 1727204465.85179: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41684 1727204465.85273: stderr chunk (state=3): >>><<< 41684 1727204465.85284: stdout chunk (state=3): >>><<< 41684 1727204465.85475: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204465.8208823-43483-244141888494479=/root/.ansible/tmp/ansible-tmp-1727204465.8208823-43483-244141888494479 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41684 1727204465.85479: variable 'ansible_module_compression' from source: unknown 41684 1727204465.85481: ANSIBALLZ: Using lock for ping 41684 
1727204465.85483: ANSIBALLZ: Acquiring lock 41684 1727204465.85485: ANSIBALLZ: Lock acquired: 139842512988704 41684 1727204465.85487: ANSIBALLZ: Creating module 41684 1727204465.98940: ANSIBALLZ: Writing module into payload 41684 1727204465.99016: ANSIBALLZ: Writing module 41684 1727204465.99042: ANSIBALLZ: Renaming module 41684 1727204465.99054: ANSIBALLZ: Done creating module 41684 1727204465.99078: variable 'ansible_facts' from source: unknown 41684 1727204465.99145: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204465.8208823-43483-244141888494479/AnsiballZ_ping.py 41684 1727204465.99304: Sending initial data 41684 1727204465.99308: Sent initial data (153 bytes) 41684 1727204466.00270: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 41684 1727204466.00287: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41684 1727204466.00302: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41684 1727204466.00320: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41684 1727204466.00366: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 41684 1727204466.00381: stderr chunk (state=3): >>>debug2: match not found <<< 41684 1727204466.00397: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204466.00416: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 41684 1727204466.00427: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 41684 1727204466.00439: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 41684 1727204466.00454: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41684 1727204466.00474: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/ssh/ssh_config <<< 41684 1727204466.00492: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41684 1727204466.00506: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 41684 1727204466.00519: stderr chunk (state=3): >>>debug2: match found <<< 41684 1727204466.00534: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204466.00615: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 41684 1727204466.00637: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41684 1727204466.00652: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41684 1727204466.00743: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41684 1727204466.02476: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 41684 1727204466.02518: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 41684 1727204466.02579: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-41684fyviudxd/tmp6tfhwdcc /root/.ansible/tmp/ansible-tmp-1727204465.8208823-43483-244141888494479/AnsiballZ_ping.py <<< 41684 1727204466.02625: stderr chunk (state=3): >>>debug1: Couldn't stat remote 
file: No such file or directory <<< 41684 1727204466.03993: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41684 1727204466.04260: stderr chunk (state=3): >>><<< 41684 1727204466.04268: stdout chunk (state=3): >>><<< 41684 1727204466.04270: done transferring module to remote 41684 1727204466.04272: _low_level_execute_command(): starting 41684 1727204466.04274: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204465.8208823-43483-244141888494479/ /root/.ansible/tmp/ansible-tmp-1727204465.8208823-43483-244141888494479/AnsiballZ_ping.py && sleep 0' 41684 1727204466.04869: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 41684 1727204466.04883: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41684 1727204466.04897: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41684 1727204466.04914: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41684 1727204466.04967: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 41684 1727204466.04982: stderr chunk (state=3): >>>debug2: match not found <<< 41684 1727204466.05000: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204466.05019: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 41684 1727204466.05036: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 41684 1727204466.05055: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 41684 1727204466.05074: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41684 1727204466.05091: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41684 1727204466.05109: stderr chunk (state=3): 
>>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41684 1727204466.05122: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 41684 1727204466.05134: stderr chunk (state=3): >>>debug2: match found <<< 41684 1727204466.05152: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204466.05232: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 41684 1727204466.05255: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41684 1727204466.05280: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41684 1727204466.05366: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41684 1727204466.07172: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41684 1727204466.07176: stdout chunk (state=3): >>><<< 41684 1727204466.07179: stderr chunk (state=3): >>><<< 41684 1727204466.07285: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41684 1727204466.07289: _low_level_execute_command(): starting 41684 1727204466.07292: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204465.8208823-43483-244141888494479/AnsiballZ_ping.py && sleep 0' 41684 1727204466.08145: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 41684 1727204466.08151: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41684 1727204466.08188: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found <<< 41684 1727204466.08192: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found <<< 41684 1727204466.08195: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204466.08265: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 41684 1727204466.08581: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41684 1727204466.08685: stderr chunk (state=3): 
>>>debug1: mux_client_request_session: master session id: 2 <<< 41684 1727204466.21484: stdout chunk (state=3): >>> {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} <<< 41684 1727204466.22488: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.148 closed. <<< 41684 1727204466.22492: stdout chunk (state=3): >>><<< 41684 1727204466.22495: stderr chunk (state=3): >>><<< 41684 1727204466.22627: _low_level_execute_command() done: rc=0, stdout= {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.148 closed. 
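The round-trip above ends with the ping module printing `{"ping": "pong", "invocation": {"module_args": {"data": "pong"}}}` on stdout; Ansible then parses that JSON to build the task result shown a few records later (`ok: [managed-node1] => {...}`). A minimal sketch of that parsing step, for reading logs like this one (the function name `parse_module_output` is illustrative, not Ansible's internal API):

```python
import json

def parse_module_output(stdout: str) -> dict:
    """Parse the JSON a module prints on stdout, as captured in the
    _low_level_execute_command() result above."""
    result = json.loads(stdout)
    return result

raw = '{"ping": "pong", "invocation": {"module_args": {"data": "pong"}}}'
result = parse_module_output(raw)
# The ping module echoes its 'data' argument back in the 'ping' field.
print(result["ping"])                                # -> pong
print(result["invocation"]["module_args"]["data"])   # -> pong
```

The `invocation` key records the arguments the module actually ran with, which is why it appears alongside `ping` in the stdout chunk captured in the log.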
41684 1727204466.22632: done with _execute_module (ping, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ping', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204465.8208823-43483-244141888494479/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 41684 1727204466.22638: _low_level_execute_command(): starting 41684 1727204466.22641: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204465.8208823-43483-244141888494479/ > /dev/null 2>&1 && sleep 0' 41684 1727204466.23260: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 41684 1727204466.23279: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41684 1727204466.23293: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41684 1727204466.23314: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41684 1727204466.23355: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 41684 1727204466.23371: stderr chunk (state=3): >>>debug2: match not found <<< 41684 1727204466.23385: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204466.23404: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 41684 1727204466.23419: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 41684 
1727204466.23436: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 41684 1727204466.23449: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41684 1727204466.23469: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41684 1727204466.23487: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41684 1727204466.23499: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 41684 1727204466.23510: stderr chunk (state=3): >>>debug2: match found <<< 41684 1727204466.23524: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204466.23610: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 41684 1727204466.23635: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41684 1727204466.23661: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41684 1727204466.23791: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41684 1727204466.25556: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41684 1727204466.25608: stderr chunk (state=3): >>><<< 41684 1727204466.25611: stdout chunk (state=3): >>><<< 41684 1727204466.25626: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing 
configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41684 1727204466.25635: handler run complete 41684 1727204466.25647: attempt loop complete, returning result 41684 1727204466.25650: _execute() done 41684 1727204466.25652: dumping result to json 41684 1727204466.25656: done dumping result, returning 41684 1727204466.25670: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Re-test connectivity [0affcd87-79f5-3839-086d-000000000030] 41684 1727204466.25675: sending task result for task 0affcd87-79f5-3839-086d-000000000030 41684 1727204466.25769: done sending task result for task 0affcd87-79f5-3839-086d-000000000030 41684 1727204466.25772: WORKER PROCESS EXITING ok: [managed-node1] => { "changed": false, "ping": "pong" } 41684 1727204466.25826: no more pending results, returning what we have 41684 1727204466.25830: results queue empty 41684 1727204466.25831: checking for any_errors_fatal 41684 1727204466.25838: done checking for any_errors_fatal 41684 1727204466.25838: checking for max_fail_percentage 41684 1727204466.25840: done checking for max_fail_percentage 41684 1727204466.25841: checking to see if all hosts have failed and the running result is not ok 41684 1727204466.25841: done checking to see if all hosts have failed 41684 1727204466.25842: getting the remaining hosts for this loop 41684 1727204466.25844: done getting the remaining hosts for this loop 41684 1727204466.25848: getting 
the next task for host managed-node1 41684 1727204466.25857: done getting next task for host managed-node1 41684 1727204466.25861: ^ task is: TASK: meta (role_complete) 41684 1727204466.25866: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 41684 1727204466.25876: getting variables 41684 1727204466.25879: in VariableManager get_vars() 41684 1727204466.25922: Calling all_inventory to load vars for managed-node1 41684 1727204466.25925: Calling groups_inventory to load vars for managed-node1 41684 1727204466.25927: Calling all_plugins_inventory to load vars for managed-node1 41684 1727204466.25938: Calling all_plugins_play to load vars for managed-node1 41684 1727204466.25940: Calling groups_plugins_inventory to load vars for managed-node1 41684 1727204466.25942: Calling groups_plugins_play to load vars for managed-node1 41684 1727204466.27453: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41684 1727204466.29004: done with get_vars() 41684 1727204466.29025: done getting variables 41684 1727204466.29090: done queuing things up, now waiting for results queue to drain 41684 1727204466.29092: results queue empty 41684 1727204466.29092: checking for any_errors_fatal 41684 1727204466.29094: done checking for any_errors_fatal 41684 1727204466.29095: checking for max_fail_percentage 41684 1727204466.29095: done checking for max_fail_percentage 41684 1727204466.29096: checking to see if all hosts 
have failed and the running result is not ok 41684 1727204466.29096: done checking to see if all hosts have failed 41684 1727204466.29097: getting the remaining hosts for this loop 41684 1727204466.29097: done getting the remaining hosts for this loop 41684 1727204466.29100: getting the next task for host managed-node1 41684 1727204466.29102: done getting next task for host managed-node1 41684 1727204466.29104: ^ task is: TASK: Get the IPv4 routes from the route table main 41684 1727204466.29105: ^ state is: HOST STATE: block=3, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 41684 1727204466.29106: getting variables 41684 1727204466.29110: in VariableManager get_vars() 41684 1727204466.29122: Calling all_inventory to load vars for managed-node1 41684 1727204466.29124: Calling groups_inventory to load vars for managed-node1 41684 1727204466.29125: Calling all_plugins_inventory to load vars for managed-node1 41684 1727204466.29128: Calling all_plugins_play to load vars for managed-node1 41684 1727204466.29130: Calling groups_plugins_inventory to load vars for managed-node1 41684 1727204466.29131: Calling groups_plugins_play to load vars for managed-node1 41684 1727204466.30357: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41684 1727204466.33700: done with get_vars() 41684 1727204466.33731: done getting variables 41684 1727204466.34578: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Get the IPv4 routes from the route table main] 
*************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_route_device.yml:73 Tuesday 24 September 2024 15:01:06 -0400 (0:00:00.578) 0:00:22.747 ***** 41684 1727204466.34621: entering _queue_task() for managed-node1/command 41684 1727204466.35472: worker is 1 (out of 1 available) 41684 1727204466.35485: exiting _queue_task() for managed-node1/command 41684 1727204466.35499: done queuing things up, now waiting for results queue to drain 41684 1727204466.35501: waiting for pending results... 41684 1727204466.36378: running TaskExecutor() for managed-node1/TASK: Get the IPv4 routes from the route table main 41684 1727204466.36595: in run() - task 0affcd87-79f5-3839-086d-000000000060 41684 1727204466.36736: variable 'ansible_search_path' from source: unknown 41684 1727204466.36782: calling self._execute() 41684 1727204466.36997: variable 'ansible_host' from source: host vars for 'managed-node1' 41684 1727204466.37008: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41684 1727204466.37023: variable 'omit' from source: magic vars 41684 1727204466.37746: variable 'ansible_distribution_major_version' from source: facts 41684 1727204466.37824: Evaluated conditional (ansible_distribution_major_version != '6'): True 41684 1727204466.37926: variable 'omit' from source: magic vars 41684 1727204466.37953: variable 'omit' from source: magic vars 41684 1727204466.37996: variable 'omit' from source: magic vars 41684 1727204466.38094: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 41684 1727204466.38251: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 41684 1727204466.38280: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 41684 1727204466.38303: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41684 1727204466.38319: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41684 1727204466.38474: variable 'inventory_hostname' from source: host vars for 'managed-node1' 41684 1727204466.38484: variable 'ansible_host' from source: host vars for 'managed-node1' 41684 1727204466.38493: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41684 1727204466.38602: Set connection var ansible_connection to ssh 41684 1727204466.38686: Set connection var ansible_pipelining to False 41684 1727204466.38698: Set connection var ansible_module_compression to ZIP_DEFLATED 41684 1727204466.38708: Set connection var ansible_timeout to 10 41684 1727204466.38722: Set connection var ansible_shell_executable to /bin/sh 41684 1727204466.38728: Set connection var ansible_shell_type to sh 41684 1727204466.38814: variable 'ansible_shell_executable' from source: unknown 41684 1727204466.38822: variable 'ansible_connection' from source: unknown 41684 1727204466.38828: variable 'ansible_module_compression' from source: unknown 41684 1727204466.38835: variable 'ansible_shell_type' from source: unknown 41684 1727204466.38839: variable 'ansible_shell_executable' from source: unknown 41684 1727204466.38846: variable 'ansible_host' from source: host vars for 'managed-node1' 41684 1727204466.38901: variable 'ansible_pipelining' from source: unknown 41684 1727204466.38909: variable 'ansible_timeout' from source: unknown 41684 1727204466.38916: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41684 1727204466.39174: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 41684 1727204466.39231: variable 'omit' from source: magic vars 41684 1727204466.39336: starting attempt loop 41684 1727204466.39345: running the handler 41684 1727204466.39368: _low_level_execute_command(): starting 41684 1727204466.39384: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 41684 1727204466.41437: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 41684 1727204466.41444: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41684 1727204466.41460: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204466.41590: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 41684 1727204466.41594: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204466.41672: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 41684 1727204466.41675: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41684 1727204466.41677: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41684 1727204466.41746: stderr 
chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41684 1727204466.43286: stdout chunk (state=3): >>>/root <<< 41684 1727204466.43389: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41684 1727204466.43475: stderr chunk (state=3): >>><<< 41684 1727204466.43478: stdout chunk (state=3): >>><<< 41684 1727204466.43608: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41684 1727204466.43612: _low_level_execute_command(): starting 41684 1727204466.43615: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204466.4350443-43508-45139190424100 `" && echo ansible-tmp-1727204466.4350443-43508-45139190424100="` echo /root/.ansible/tmp/ansible-tmp-1727204466.4350443-43508-45139190424100 `" ) && 
sleep 0' 41684 1727204466.44376: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 41684 1727204466.44393: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41684 1727204466.44409: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41684 1727204466.44428: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41684 1727204466.44478: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 41684 1727204466.44505: stderr chunk (state=3): >>>debug2: match not found <<< 41684 1727204466.44519: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204466.44539: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 41684 1727204466.44552: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 41684 1727204466.44573: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 41684 1727204466.44588: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41684 1727204466.44610: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41684 1727204466.44627: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41684 1727204466.44641: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 41684 1727204466.44654: stderr chunk (state=3): >>>debug2: match found <<< 41684 1727204466.44670: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204466.44752: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 41684 1727204466.44784: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41684 
1727204466.44787: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41684 1727204466.44882: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41684 1727204466.46708: stdout chunk (state=3): >>>ansible-tmp-1727204466.4350443-43508-45139190424100=/root/.ansible/tmp/ansible-tmp-1727204466.4350443-43508-45139190424100 <<< 41684 1727204466.46902: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41684 1727204466.46906: stdout chunk (state=3): >>><<< 41684 1727204466.46913: stderr chunk (state=3): >>><<< 41684 1727204466.46934: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204466.4350443-43508-45139190424100=/root/.ansible/tmp/ansible-tmp-1727204466.4350443-43508-45139190424100 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41684 1727204466.46974: variable 'ansible_module_compression' 
from source: unknown 41684 1727204466.47030: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-41684fyviudxd/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 41684 1727204466.47062: variable 'ansible_facts' from source: unknown 41684 1727204466.47150: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204466.4350443-43508-45139190424100/AnsiballZ_command.py 41684 1727204466.47304: Sending initial data 41684 1727204466.47308: Sent initial data (155 bytes) 41684 1727204466.48289: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 41684 1727204466.48299: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41684 1727204466.48309: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41684 1727204466.48324: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41684 1727204466.48362: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 41684 1727204466.48381: stderr chunk (state=3): >>>debug2: match not found <<< 41684 1727204466.48392: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204466.48405: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 41684 1727204466.48413: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 41684 1727204466.48420: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 41684 1727204466.48428: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41684 1727204466.48437: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41684 1727204466.48451: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41684 1727204466.48458: stderr chunk (state=3): >>>debug2: 
checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 41684 1727204466.48470: stderr chunk (state=3): >>>debug2: match found <<< 41684 1727204466.48486: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204466.48869: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 41684 1727204466.48873: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41684 1727204466.48875: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41684 1727204466.48877: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41684 1727204466.50515: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 41684 1727204466.50560: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 41684 1727204466.50606: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-41684fyviudxd/tmp0nq5gukx /root/.ansible/tmp/ansible-tmp-1727204466.4350443-43508-45139190424100/AnsiballZ_command.py <<< 41684 1727204466.50651: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 41684 1727204466.51993: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41684 1727204466.51997: stderr chunk (state=3): >>><<< 41684 
1727204466.51999: stdout chunk (state=3): >>><<< 41684 1727204466.52004: done transferring module to remote 41684 1727204466.52006: _low_level_execute_command(): starting 41684 1727204466.52009: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204466.4350443-43508-45139190424100/ /root/.ansible/tmp/ansible-tmp-1727204466.4350443-43508-45139190424100/AnsiballZ_command.py && sleep 0' 41684 1727204466.53579: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 41684 1727204466.53649: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41684 1727204466.53657: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41684 1727204466.53659: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 41684 1727204466.53662: stderr chunk (state=3): >>>debug2: match not found <<< 41684 1727204466.53678: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204466.53690: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 41684 1727204466.53697: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 41684 1727204466.53703: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 41684 1727204466.53710: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41684 1727204466.53718: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41684 1727204466.53739: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41684 1727204466.53746: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 41684 1727204466.53752: stderr chunk 
(state=3): >>>debug2: match found <<< 41684 1727204466.53761: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204466.53907: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 41684 1727204466.53910: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41684 1727204466.53912: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41684 1727204466.53978: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41684 1727204466.55663: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41684 1727204466.55745: stderr chunk (state=3): >>><<< 41684 1727204466.55751: stdout chunk (state=3): >>><<< 41684 1727204466.55775: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 
41684 1727204466.55779: _low_level_execute_command(): starting 41684 1727204466.55784: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204466.4350443-43508-45139190424100/AnsiballZ_command.py && sleep 0' 41684 1727204466.56540: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 41684 1727204466.56555: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41684 1727204466.56576: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41684 1727204466.56596: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41684 1727204466.56636: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 41684 1727204466.56651: stderr chunk (state=3): >>>debug2: match not found <<< 41684 1727204466.56668: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204466.56686: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 41684 1727204466.56698: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 41684 1727204466.56708: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 41684 1727204466.56719: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41684 1727204466.56732: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41684 1727204466.56748: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41684 1727204466.56761: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 41684 1727204466.56775: stderr chunk (state=3): >>>debug2: match found <<< 41684 1727204466.56788: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204466.56862: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 41684 1727204466.56882: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41684 1727204466.56895: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41684 1727204466.56998: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41684 1727204466.70363: stdout chunk (state=3): >>> {"changed": true, "stdout": "default via 10.31.8.1 dev eth0 proto dhcp src 10.31.9.148 metric 100 \n10.31.8.0/22 dev eth0 proto kernel scope link src 10.31.9.148 metric 100 \n192.0.2.72/31 dev rpltstbr proto kernel scope link src 192.0.2.72 metric 425 linkdown \n198.51.10.64/26 via 198.51.100.6 dev ethtest0 proto static metric 4 \n198.51.12.128/26 via 198.51.100.1 dev ethtest1 proto static metric 2 \n198.51.100.0/24 dev ethtest0 proto kernel scope link src 198.51.100.3 metric 103 \n198.51.100.0/24 dev ethtest1 proto kernel scope link src 198.51.100.6 metric 104 ", "stderr": "", "rc": 0, "cmd": ["ip", "-4", "route"], "start": "2024-09-24 15:01:06.699311", "end": "2024-09-24 15:01:06.702801", "delta": "0:00:00.003490", "msg": "", "invocation": {"module_args": {"_raw_params": "ip -4 route", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 41684 1727204466.71522: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.148 closed. 
<<< 41684 1727204466.71526: stdout chunk (state=3): >>><<< 41684 1727204466.71532: stderr chunk (state=3): >>><<< 41684 1727204466.71555: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "default via 10.31.8.1 dev eth0 proto dhcp src 10.31.9.148 metric 100 \n10.31.8.0/22 dev eth0 proto kernel scope link src 10.31.9.148 metric 100 \n192.0.2.72/31 dev rpltstbr proto kernel scope link src 192.0.2.72 metric 425 linkdown \n198.51.10.64/26 via 198.51.100.6 dev ethtest0 proto static metric 4 \n198.51.12.128/26 via 198.51.100.1 dev ethtest1 proto static metric 2 \n198.51.100.0/24 dev ethtest0 proto kernel scope link src 198.51.100.3 metric 103 \n198.51.100.0/24 dev ethtest1 proto kernel scope link src 198.51.100.6 metric 104 ", "stderr": "", "rc": 0, "cmd": ["ip", "-4", "route"], "start": "2024-09-24 15:01:06.699311", "end": "2024-09-24 15:01:06.702801", "delta": "0:00:00.003490", "msg": "", "invocation": {"module_args": {"_raw_params": "ip -4 route", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 
10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.148 closed. 41684 1727204466.71602: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip -4 route', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204466.4350443-43508-45139190424100/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 41684 1727204466.71609: _low_level_execute_command(): starting 41684 1727204466.71614: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204466.4350443-43508-45139190424100/ > /dev/null 2>&1 && sleep 0' 41684 1727204466.72440: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 41684 1727204466.72457: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41684 1727204466.72476: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41684 1727204466.72495: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41684 1727204466.72541: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 
41684 1727204466.72558: stderr chunk (state=3): >>>debug2: match not found <<< 41684 1727204466.72576: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204466.72594: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 41684 1727204466.72607: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 41684 1727204466.72619: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 41684 1727204466.72636: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41684 1727204466.72651: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41684 1727204466.72671: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41684 1727204466.72689: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 41684 1727204466.72703: stderr chunk (state=3): >>>debug2: match found <<< 41684 1727204466.72718: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204466.72804: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 41684 1727204466.72828: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41684 1727204466.72847: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41684 1727204466.72977: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41684 1727204466.74769: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41684 1727204466.74791: stdout chunk (state=3): >>><<< 41684 1727204466.74794: stderr chunk (state=3): >>><<< 41684 1727204466.74971: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config 
debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41684 1727204466.74975: handler run complete 41684 1727204466.74978: Evaluated conditional (False): False 41684 1727204466.74980: attempt loop complete, returning result 41684 1727204466.74983: _execute() done 41684 1727204466.74986: dumping result to json 41684 1727204466.74988: done dumping result, returning 41684 1727204466.74991: done running TaskExecutor() for managed-node1/TASK: Get the IPv4 routes from the route table main [0affcd87-79f5-3839-086d-000000000060] 41684 1727204466.74993: sending task result for task 0affcd87-79f5-3839-086d-000000000060 41684 1727204466.75139: done sending task result for task 0affcd87-79f5-3839-086d-000000000060 41684 1727204466.75143: WORKER PROCESS EXITING ok: [managed-node1] => { "changed": false, "cmd": [ "ip", "-4", "route" ], "delta": "0:00:00.003490", "end": "2024-09-24 15:01:06.702801", "rc": 0, "start": "2024-09-24 15:01:06.699311" } STDOUT: default via 10.31.8.1 dev eth0 proto dhcp src 10.31.9.148 
metric 100 10.31.8.0/22 dev eth0 proto kernel scope link src 10.31.9.148 metric 100 192.0.2.72/31 dev rpltstbr proto kernel scope link src 192.0.2.72 metric 425 linkdown 198.51.10.64/26 via 198.51.100.6 dev ethtest0 proto static metric 4 198.51.12.128/26 via 198.51.100.1 dev ethtest1 proto static metric 2 198.51.100.0/24 dev ethtest0 proto kernel scope link src 198.51.100.3 metric 103 198.51.100.0/24 dev ethtest1 proto kernel scope link src 198.51.100.6 metric 104 41684 1727204466.75227: no more pending results, returning what we have 41684 1727204466.75231: results queue empty 41684 1727204466.75232: checking for any_errors_fatal 41684 1727204466.75235: done checking for any_errors_fatal 41684 1727204466.75236: checking for max_fail_percentage 41684 1727204466.75237: done checking for max_fail_percentage 41684 1727204466.75238: checking to see if all hosts have failed and the running result is not ok 41684 1727204466.75239: done checking to see if all hosts have failed 41684 1727204466.75240: getting the remaining hosts for this loop 41684 1727204466.75242: done getting the remaining hosts for this loop 41684 1727204466.75246: getting the next task for host managed-node1 41684 1727204466.75254: done getting next task for host managed-node1 41684 1727204466.75257: ^ task is: TASK: Assert that the route table main contains the specified IPv4 routes 41684 1727204466.75259: ^ state is: HOST STATE: block=3, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 41684 1727204466.75263: getting variables 41684 1727204466.75266: in VariableManager get_vars() 41684 1727204466.75310: Calling all_inventory to load vars for managed-node1 41684 1727204466.75312: Calling groups_inventory to load vars for managed-node1 41684 1727204466.75315: Calling all_plugins_inventory to load vars for managed-node1 41684 1727204466.75326: Calling all_plugins_play to load vars for managed-node1 41684 1727204466.75329: Calling groups_plugins_inventory to load vars for managed-node1 41684 1727204466.75332: Calling groups_plugins_play to load vars for managed-node1 41684 1727204466.76948: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41684 1727204466.78639: done with get_vars() 41684 1727204466.78674: done getting variables 41684 1727204466.78735: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Assert that the route table main contains the specified IPv4 routes] ***** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_route_device.yml:78 Tuesday 24 September 2024 15:01:06 -0400 (0:00:00.441) 0:00:23.189 ***** 41684 1727204466.78771: entering _queue_task() for managed-node1/assert 41684 1727204466.79090: worker is 1 (out of 1 available) 41684 1727204466.79103: exiting _queue_task() for managed-node1/assert 41684 1727204466.79115: done queuing things up, now waiting for results queue to drain 41684 1727204466.79117: waiting for pending results... 
41684 1727204466.79395: running TaskExecutor() for managed-node1/TASK: Assert that the route table main contains the specified IPv4 routes 41684 1727204466.79508: in run() - task 0affcd87-79f5-3839-086d-000000000061 41684 1727204466.79530: variable 'ansible_search_path' from source: unknown 41684 1727204466.79575: calling self._execute() 41684 1727204466.79674: variable 'ansible_host' from source: host vars for 'managed-node1' 41684 1727204466.79686: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41684 1727204466.79700: variable 'omit' from source: magic vars 41684 1727204466.80074: variable 'ansible_distribution_major_version' from source: facts 41684 1727204466.80094: Evaluated conditional (ansible_distribution_major_version != '6'): True 41684 1727204466.80109: variable 'omit' from source: magic vars 41684 1727204466.80136: variable 'omit' from source: magic vars 41684 1727204466.80177: variable 'omit' from source: magic vars 41684 1727204466.80228: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 41684 1727204466.80265: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 41684 1727204466.80289: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 41684 1727204466.80306: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41684 1727204466.80323: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41684 1727204466.80353: variable 'inventory_hostname' from source: host vars for 'managed-node1' 41684 1727204466.80360: variable 'ansible_host' from source: host vars for 'managed-node1' 41684 1727204466.80369: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41684 1727204466.80469: Set connection var 
ansible_connection to ssh 41684 1727204466.80483: Set connection var ansible_pipelining to False 41684 1727204466.80494: Set connection var ansible_module_compression to ZIP_DEFLATED 41684 1727204466.80504: Set connection var ansible_timeout to 10 41684 1727204466.80515: Set connection var ansible_shell_executable to /bin/sh 41684 1727204466.80522: Set connection var ansible_shell_type to sh 41684 1727204466.80555: variable 'ansible_shell_executable' from source: unknown 41684 1727204466.80567: variable 'ansible_connection' from source: unknown 41684 1727204466.80575: variable 'ansible_module_compression' from source: unknown 41684 1727204466.80582: variable 'ansible_shell_type' from source: unknown 41684 1727204466.80588: variable 'ansible_shell_executable' from source: unknown 41684 1727204466.80594: variable 'ansible_host' from source: host vars for 'managed-node1' 41684 1727204466.80601: variable 'ansible_pipelining' from source: unknown 41684 1727204466.80607: variable 'ansible_timeout' from source: unknown 41684 1727204466.80614: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41684 1727204466.80767: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 41684 1727204466.80785: variable 'omit' from source: magic vars 41684 1727204466.80795: starting attempt loop 41684 1727204466.80802: running the handler 41684 1727204466.80985: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 41684 1727204466.81233: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 41684 1727204466.81284: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 41684 
1727204466.81374: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 41684 1727204466.81419: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 41684 1727204466.81512: variable 'route_table_main_ipv4' from source: set_fact 41684 1727204466.81554: Evaluated conditional (route_table_main_ipv4.stdout is search("198.51.10.64/26 via 198.51.100.6 dev ethtest0\s+(proto static )?metric 4")): True 41684 1727204466.81705: variable 'route_table_main_ipv4' from source: set_fact 41684 1727204466.81745: Evaluated conditional (route_table_main_ipv4.stdout is search("198.51.12.128/26 via 198.51.100.1 dev ethtest1\s+(proto static )?metric 2")): True 41684 1727204466.81756: handler run complete 41684 1727204466.81779: attempt loop complete, returning result 41684 1727204466.81787: _execute() done 41684 1727204466.81795: dumping result to json 41684 1727204466.81801: done dumping result, returning 41684 1727204466.81812: done running TaskExecutor() for managed-node1/TASK: Assert that the route table main contains the specified IPv4 routes [0affcd87-79f5-3839-086d-000000000061] 41684 1727204466.81822: sending task result for task 0affcd87-79f5-3839-086d-000000000061 41684 1727204466.81936: done sending task result for task 0affcd87-79f5-3839-086d-000000000061 41684 1727204466.81944: WORKER PROCESS EXITING ok: [managed-node1] => { "changed": false } MSG: All assertions passed 41684 1727204466.82000: no more pending results, returning what we have 41684 1727204466.82005: results queue empty 41684 1727204466.82006: checking for any_errors_fatal 41684 1727204466.82016: done checking for any_errors_fatal 41684 1727204466.82017: checking for max_fail_percentage 41684 1727204466.82019: done checking for max_fail_percentage 41684 1727204466.82020: checking to see if all hosts have failed and the running result is not ok 41684 1727204466.82021: done checking to see if all hosts have failed 
41684 1727204466.82022: getting the remaining hosts for this loop 41684 1727204466.82024: done getting the remaining hosts for this loop 41684 1727204466.82028: getting the next task for host managed-node1 41684 1727204466.82036: done getting next task for host managed-node1 41684 1727204466.82041: ^ task is: TASK: Get the IPv6 routes from the route table main 41684 1727204466.82043: ^ state is: HOST STATE: block=3, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 41684 1727204466.82047: getting variables 41684 1727204466.82049: in VariableManager get_vars() 41684 1727204466.82096: Calling all_inventory to load vars for managed-node1 41684 1727204466.82099: Calling groups_inventory to load vars for managed-node1 41684 1727204466.82102: Calling all_plugins_inventory to load vars for managed-node1 41684 1727204466.82114: Calling all_plugins_play to load vars for managed-node1 41684 1727204466.82117: Calling groups_plugins_inventory to load vars for managed-node1 41684 1727204466.82120: Calling groups_plugins_play to load vars for managed-node1 41684 1727204466.87594: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41684 1727204466.88499: done with get_vars() 41684 1727204466.88517: done getting variables 41684 1727204466.88554: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Get the IPv6 routes from the route table main] *************************** task path: 
/tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_route_device.yml:89 Tuesday 24 September 2024 15:01:06 -0400 (0:00:00.098) 0:00:23.287 ***** 41684 1727204466.88578: entering _queue_task() for managed-node1/command 41684 1727204466.88859: worker is 1 (out of 1 available) 41684 1727204466.88873: exiting _queue_task() for managed-node1/command 41684 1727204466.88886: done queuing things up, now waiting for results queue to drain 41684 1727204466.88887: waiting for pending results... 41684 1727204466.89181: running TaskExecutor() for managed-node1/TASK: Get the IPv6 routes from the route table main 41684 1727204466.89292: in run() - task 0affcd87-79f5-3839-086d-000000000062 41684 1727204466.89312: variable 'ansible_search_path' from source: unknown 41684 1727204466.89358: calling self._execute() 41684 1727204466.89472: variable 'ansible_host' from source: host vars for 'managed-node1' 41684 1727204466.89487: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41684 1727204466.89501: variable 'omit' from source: magic vars 41684 1727204466.89901: variable 'ansible_distribution_major_version' from source: facts 41684 1727204466.89922: Evaluated conditional (ansible_distribution_major_version != '6'): True 41684 1727204466.89935: variable 'omit' from source: magic vars 41684 1727204466.89962: variable 'omit' from source: magic vars 41684 1727204466.90013: variable 'omit' from source: magic vars 41684 1727204466.90062: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 41684 1727204466.90109: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 41684 1727204466.90138: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 41684 1727204466.90160: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, 
class_only=False) 41684 1727204466.90177: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41684 1727204466.90214: variable 'inventory_hostname' from source: host vars for 'managed-node1' 41684 1727204466.90222: variable 'ansible_host' from source: host vars for 'managed-node1' 41684 1727204466.90229: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41684 1727204466.90333: Set connection var ansible_connection to ssh 41684 1727204466.90345: Set connection var ansible_pipelining to False 41684 1727204466.90355: Set connection var ansible_module_compression to ZIP_DEFLATED 41684 1727204466.90367: Set connection var ansible_timeout to 10 41684 1727204466.90380: Set connection var ansible_shell_executable to /bin/sh 41684 1727204466.90388: Set connection var ansible_shell_type to sh 41684 1727204466.90422: variable 'ansible_shell_executable' from source: unknown 41684 1727204466.90431: variable 'ansible_connection' from source: unknown 41684 1727204466.90438: variable 'ansible_module_compression' from source: unknown 41684 1727204466.90447: variable 'ansible_shell_type' from source: unknown 41684 1727204466.90456: variable 'ansible_shell_executable' from source: unknown 41684 1727204466.90463: variable 'ansible_host' from source: host vars for 'managed-node1' 41684 1727204466.90474: variable 'ansible_pipelining' from source: unknown 41684 1727204466.90482: variable 'ansible_timeout' from source: unknown 41684 1727204466.90491: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41684 1727204466.90645: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 41684 1727204466.90663: 
variable 'omit' from source: magic vars 41684 1727204466.90676: starting attempt loop 41684 1727204466.90683: running the handler 41684 1727204466.90704: _low_level_execute_command(): starting 41684 1727204466.90716: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 41684 1727204466.91293: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41684 1727204466.91316: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204466.91330: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204466.91381: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 41684 1727204466.91401: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 41684 1727204466.91455: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41684 1727204466.93011: stdout chunk (state=3): >>>/root <<< 41684 1727204466.93117: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41684 1727204466.93219: stderr chunk (state=3): >>><<< 41684 1727204466.93232: stdout chunk (state=3): >>><<< 41684 
1727204466.93268: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41684 1727204466.93288: _low_level_execute_command(): starting 41684 1727204466.93298: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204466.932762-43538-190013243628732 `" && echo ansible-tmp-1727204466.932762-43538-190013243628732="` echo /root/.ansible/tmp/ansible-tmp-1727204466.932762-43538-190013243628732 `" ) && sleep 0' 41684 1727204466.94008: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 41684 1727204466.94024: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41684 1727204466.94042: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41684 1727204466.94062: stderr 
chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41684 1727204466.94119: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 41684 1727204466.94131: stderr chunk (state=3): >>>debug2: match not found <<< 41684 1727204466.94146: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204466.94175: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 41684 1727204466.94196: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 41684 1727204466.94208: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 41684 1727204466.94221: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41684 1727204466.94234: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41684 1727204466.94249: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41684 1727204466.94260: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 41684 1727204466.94274: stderr chunk (state=3): >>>debug2: match found <<< 41684 1727204466.94294: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204466.94375: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 41684 1727204466.94403: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41684 1727204466.94423: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41684 1727204466.94513: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41684 1727204466.96353: stdout chunk (state=3): 
>>>ansible-tmp-1727204466.932762-43538-190013243628732=/root/.ansible/tmp/ansible-tmp-1727204466.932762-43538-190013243628732 <<< 41684 1727204466.96476: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41684 1727204466.96557: stderr chunk (state=3): >>><<< 41684 1727204466.96562: stdout chunk (state=3): >>><<< 41684 1727204466.96647: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204466.932762-43538-190013243628732=/root/.ansible/tmp/ansible-tmp-1727204466.932762-43538-190013243628732 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41684 1727204466.96652: variable 'ansible_module_compression' from source: unknown 41684 1727204466.96890: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-41684fyviudxd/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 41684 1727204466.96894: variable 'ansible_facts' from source: unknown 41684 
1727204466.96896: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204466.932762-43538-190013243628732/AnsiballZ_command.py 41684 1727204466.96956: Sending initial data 41684 1727204466.96960: Sent initial data (155 bytes) 41684 1727204466.97899: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 41684 1727204466.97903: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41684 1727204466.97947: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204466.97951: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 41684 1727204466.97953: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204466.98016: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 41684 1727204466.98019: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41684 1727204466.98023: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41684 1727204466.98083: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41684 1727204466.99819: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports 
extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 41684 1727204466.99876: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 41684 1727204466.99943: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-41684fyviudxd/tmplk4e_9h9 /root/.ansible/tmp/ansible-tmp-1727204466.932762-43538-190013243628732/AnsiballZ_command.py <<< 41684 1727204467.00006: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 41684 1727204467.00869: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41684 1727204467.00988: stderr chunk (state=3): >>><<< 41684 1727204467.00991: stdout chunk (state=3): >>><<< 41684 1727204467.01010: done transferring module to remote 41684 1727204467.01019: _low_level_execute_command(): starting 41684 1727204467.01024: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204466.932762-43538-190013243628732/ /root/.ansible/tmp/ansible-tmp-1727204466.932762-43538-190013243628732/AnsiballZ_command.py && sleep 0' 41684 1727204467.01492: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 41684 1727204467.01515: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41684 1727204467.01528: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 
originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204467.01539: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204467.01592: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 41684 1727204467.01606: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 41684 1727204467.01668: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41684 1727204467.03368: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41684 1727204467.03429: stderr chunk (state=3): >>><<< 41684 1727204467.03434: stdout chunk (state=3): >>><<< 41684 1727204467.03455: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41684 1727204467.03459: _low_level_execute_command(): starting 41684 1727204467.03462: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204466.932762-43538-190013243628732/AnsiballZ_command.py && sleep 0' 41684 1727204467.04084: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 41684 1727204467.04145: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41684 1727204467.17505: stdout chunk (state=3): >>> {"changed": true, 
"stdout": "::1 dev lo proto kernel metric 256 pref medium\n2001:db6::4 via 2001:db8::1 dev ethtest0 proto static metric 2 pref medium\n2001:db8::/32 dev ethtest0 proto kernel metric 103 pref medium\n2001:db8::/32 dev ethtest1 proto kernel metric 104 pref medium\nfe80::/64 dev eth0 proto kernel metric 256 pref medium\nfe80::/64 dev peerethtest0 proto kernel metric 256 pref medium\nfe80::/64 dev peerethtest1 proto kernel metric 256 pref medium\nfe80::/64 dev ethtest0 proto kernel metric 1024 pref medium\nfe80::/64 dev ethtest1 proto kernel metric 1024 pref medium", "stderr": "", "rc": 0, "cmd": ["ip", "-6", "route"], "start": "2024-09-24 15:01:07.171034", "end": "2024-09-24 15:01:07.174180", "delta": "0:00:00.003146", "msg": "", "invocation": {"module_args": {"_raw_params": "ip -6 route", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 41684 1727204467.18677: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.148 closed. 
<<< 41684 1727204467.18681: stdout chunk (state=3): >>><<< 41684 1727204467.18683: stderr chunk (state=3): >>><<< 41684 1727204467.18831: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "::1 dev lo proto kernel metric 256 pref medium\n2001:db6::4 via 2001:db8::1 dev ethtest0 proto static metric 2 pref medium\n2001:db8::/32 dev ethtest0 proto kernel metric 103 pref medium\n2001:db8::/32 dev ethtest1 proto kernel metric 104 pref medium\nfe80::/64 dev eth0 proto kernel metric 256 pref medium\nfe80::/64 dev peerethtest0 proto kernel metric 256 pref medium\nfe80::/64 dev peerethtest1 proto kernel metric 256 pref medium\nfe80::/64 dev ethtest0 proto kernel metric 1024 pref medium\nfe80::/64 dev ethtest1 proto kernel metric 1024 pref medium", "stderr": "", "rc": 0, "cmd": ["ip", "-6", "route"], "start": "2024-09-24 15:01:07.171034", "end": "2024-09-24 15:01:07.174180", "delta": "0:00:00.003146", "msg": "", "invocation": {"module_args": {"_raw_params": "ip -6 route", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 
originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.148 closed. 41684 1727204467.18836: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip -6 route', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204466.932762-43538-190013243628732/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 41684 1727204467.18839: _low_level_execute_command(): starting 41684 1727204467.18842: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204466.932762-43538-190013243628732/ > /dev/null 2>&1 && sleep 0' 41684 1727204467.20044: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 41684 1727204467.20048: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41684 1727204467.20070: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 41684 1727204467.20385: stderr chunk (state=3): >>>debug2: match not found <<< 41684 1727204467.20395: 
stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204467.20409: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 41684 1727204467.20417: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 41684 1727204467.20425: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 41684 1727204467.20434: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41684 1727204467.20445: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41684 1727204467.20460: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41684 1727204467.20472: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 41684 1727204467.20480: stderr chunk (state=3): >>>debug2: match found <<< 41684 1727204467.20489: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204467.20560: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 41684 1727204467.20638: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41684 1727204467.20652: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41684 1727204467.20746: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41684 1727204467.22609: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41684 1727204467.22614: stdout chunk (state=3): >>><<< 41684 1727204467.22622: stderr chunk (state=3): >>><<< 41684 1727204467.22644: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41684 1727204467.22647: handler run complete 41684 1727204467.22680: Evaluated conditional (False): False 41684 1727204467.22690: attempt loop complete, returning result 41684 1727204467.22693: _execute() done 41684 1727204467.22696: dumping result to json 41684 1727204467.22701: done dumping result, returning 41684 1727204467.22710: done running TaskExecutor() for managed-node1/TASK: Get the IPv6 routes from the route table main [0affcd87-79f5-3839-086d-000000000062] 41684 1727204467.22716: sending task result for task 0affcd87-79f5-3839-086d-000000000062 41684 1727204467.22844: done sending task result for task 0affcd87-79f5-3839-086d-000000000062 41684 1727204467.22848: WORKER PROCESS EXITING
ok: [managed-node1] => {
    "changed": false,
    "cmd": [
        "ip",
        "-6",
        "route"
    ],
    "delta": "0:00:00.003146",
    "end": "2024-09-24 15:01:07.174180",
    "rc": 0,
    "start": "2024-09-24 15:01:07.171034"
}

STDOUT:

::1 dev lo proto kernel metric 256 pref medium
2001:db6::4 via 2001:db8::1 dev ethtest0 proto static metric 2 pref medium
2001:db8::/32 dev ethtest0 proto kernel metric 103 pref medium
2001:db8::/32 dev ethtest1 proto kernel metric 104 pref medium
fe80::/64 dev eth0 proto kernel metric 256 pref medium
fe80::/64 dev peerethtest0 proto kernel metric 256 pref medium
fe80::/64 dev peerethtest1 proto kernel metric 256 pref medium
fe80::/64 dev ethtest0 proto kernel metric 1024 pref medium
fe80::/64 dev ethtest1 proto kernel metric 1024 pref medium

41684 1727204467.22925: no more pending results, returning what we have 41684 1727204467.22929: results queue empty 41684 1727204467.22930: checking for any_errors_fatal 41684 1727204467.22940: done checking for any_errors_fatal 41684 1727204467.22941: checking for max_fail_percentage 41684 1727204467.22942: done checking for max_fail_percentage 41684 1727204467.22943: checking to see if all hosts have failed and the running result is not ok 41684 1727204467.22944: done checking to see if all hosts have failed 41684 1727204467.22945: getting the remaining hosts for this loop 41684 1727204467.22946: done getting the remaining hosts for this loop 41684 1727204467.22950: getting the next task for host managed-node1 41684 1727204467.22956: done getting next task for host managed-node1 41684 1727204467.22958: ^ task is: TASK: Assert that the route table main contains the specified IPv6 routes 41684 1727204467.22960: ^ state is: HOST STATE: block=3, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task?
False 41684 1727204467.22963: getting variables 41684 1727204467.22967: in VariableManager get_vars() 41684 1727204467.23006: Calling all_inventory to load vars for managed-node1 41684 1727204467.23008: Calling groups_inventory to load vars for managed-node1 41684 1727204467.23010: Calling all_plugins_inventory to load vars for managed-node1 41684 1727204467.23020: Calling all_plugins_play to load vars for managed-node1 41684 1727204467.23022: Calling groups_plugins_inventory to load vars for managed-node1 41684 1727204467.23024: Calling groups_plugins_play to load vars for managed-node1 41684 1727204467.24901: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41684 1727204467.27203: done with get_vars() 41684 1727204467.27235: done getting variables 41684 1727204467.27306: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Assert that the route table main contains the specified IPv6 routes] ***** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_route_device.yml:94 Tuesday 24 September 2024 15:01:07 -0400 (0:00:00.387) 0:00:23.674 ***** 41684 1727204467.27336: entering _queue_task() for managed-node1/assert 41684 1727204467.27802: worker is 1 (out of 1 available) 41684 1727204467.27816: exiting _queue_task() for managed-node1/assert 41684 1727204467.27831: done queuing things up, now waiting for results queue to drain 41684 1727204467.27832: waiting for pending results... 
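The `ip -6 route` task above came back from the remote side as one JSON object written to the module's stdout (the `{"changed": true, ...}` blob in the stdout chunks). As a minimal sketch in plain Python — not Ansible's own result handling — the fields the later assert task cares about can be pulled out of a trimmed sample of that reply:

```python
import json

# Trimmed sample of the AnsiballZ command-module reply captured in the log
# above (the real reply also carries the full module_args invocation).
raw_reply = json.dumps({
    "changed": True,
    "stdout": "::1 dev lo proto kernel metric 256 pref medium\n"
              "2001:db6::4 via 2001:db8::1 dev ethtest0 proto static metric 2 pref medium",
    "stderr": "",
    "rc": 0,
    "cmd": ["ip", "-6", "route"],
})

def parse_module_reply(raw: str):
    """Return (rc, stdout split into route lines) from a command-module reply."""
    reply = json.loads(raw)
    return reply["rc"], reply["stdout"].splitlines()

rc, routes = parse_module_reply(raw_reply)
```

Splitting `stdout` on newlines recovers one route per line, which is the form the `search` test in the assert task then scans.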
41684 1727204467.28221: running TaskExecutor() for managed-node1/TASK: Assert that the route table main contains the specified IPv6 routes 41684 1727204467.28320: in run() - task 0affcd87-79f5-3839-086d-000000000063 41684 1727204467.28334: variable 'ansible_search_path' from source: unknown 41684 1727204467.28378: calling self._execute() 41684 1727204467.28479: variable 'ansible_host' from source: host vars for 'managed-node1' 41684 1727204467.28483: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41684 1727204467.28493: variable 'omit' from source: magic vars 41684 1727204467.28902: variable 'ansible_distribution_major_version' from source: facts 41684 1727204467.28916: Evaluated conditional (ansible_distribution_major_version != '6'): True 41684 1727204467.28923: variable 'omit' from source: magic vars 41684 1727204467.28948: variable 'omit' from source: magic vars 41684 1727204467.28988: variable 'omit' from source: magic vars 41684 1727204467.29034: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 41684 1727204467.29078: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 41684 1727204467.29101: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 41684 1727204467.29121: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41684 1727204467.29131: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41684 1727204467.29166: variable 'inventory_hostname' from source: host vars for 'managed-node1' 41684 1727204467.29173: variable 'ansible_host' from source: host vars for 'managed-node1' 41684 1727204467.29176: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41684 1727204467.29285: Set connection var 
ansible_connection to ssh 41684 1727204467.29291: Set connection var ansible_pipelining to False 41684 1727204467.29296: Set connection var ansible_module_compression to ZIP_DEFLATED 41684 1727204467.29302: Set connection var ansible_timeout to 10 41684 1727204467.29309: Set connection var ansible_shell_executable to /bin/sh 41684 1727204467.29312: Set connection var ansible_shell_type to sh 41684 1727204467.29344: variable 'ansible_shell_executable' from source: unknown 41684 1727204467.29347: variable 'ansible_connection' from source: unknown 41684 1727204467.29349: variable 'ansible_module_compression' from source: unknown 41684 1727204467.29352: variable 'ansible_shell_type' from source: unknown 41684 1727204467.29354: variable 'ansible_shell_executable' from source: unknown 41684 1727204467.29356: variable 'ansible_host' from source: host vars for 'managed-node1' 41684 1727204467.29361: variable 'ansible_pipelining' from source: unknown 41684 1727204467.29372: variable 'ansible_timeout' from source: unknown 41684 1727204467.29377: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41684 1727204467.29521: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 41684 1727204467.29531: variable 'omit' from source: magic vars 41684 1727204467.29536: starting attempt loop 41684 1727204467.29539: running the handler 41684 1727204467.29729: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 41684 1727204467.30118: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 41684 1727204467.30522: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 41684 
1727204467.30942: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 41684 1727204467.31024: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 41684 1727204467.31119: variable 'route_table_main_ipv6' from source: set_fact 41684 1727204467.31170: Evaluated conditional (route_table_main_ipv6.stdout is search("2001:db6::4 via 2001:db8::1 dev ethtest0\s+(proto static )?metric 2")): True 41684 1727204467.31179: handler run complete 41684 1727204467.31194: attempt loop complete, returning result 41684 1727204467.31198: _execute() done 41684 1727204467.31201: dumping result to json 41684 1727204467.31203: done dumping result, returning 41684 1727204467.31211: done running TaskExecutor() for managed-node1/TASK: Assert that the route table main contains the specified IPv6 routes [0affcd87-79f5-3839-086d-000000000063] 41684 1727204467.31216: sending task result for task 0affcd87-79f5-3839-086d-000000000063 41684 1727204467.31325: done sending task result for task 0affcd87-79f5-3839-086d-000000000063 41684 1727204467.31327: WORKER PROCESS EXITING ok: [managed-node1] => { "changed": false } MSG: All assertions passed 41684 1727204467.31399: no more pending results, returning what we have 41684 1727204467.31403: results queue empty 41684 1727204467.31404: checking for any_errors_fatal 41684 1727204467.31412: done checking for any_errors_fatal 41684 1727204467.31412: checking for max_fail_percentage 41684 1727204467.31415: done checking for max_fail_percentage 41684 1727204467.31415: checking to see if all hosts have failed and the running result is not ok 41684 1727204467.31416: done checking to see if all hosts have failed 41684 1727204467.31417: getting the remaining hosts for this loop 41684 1727204467.31419: done getting the remaining hosts for this loop 41684 1727204467.31423: getting the next task for host managed-node1 41684 1727204467.31431: done getting next task 
for host managed-node1 41684 1727204467.31434: ^ task is: TASK: Get the interface1 MAC address 41684 1727204467.31436: ^ state is: HOST STATE: block=3, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 41684 1727204467.31439: getting variables 41684 1727204467.31441: in VariableManager get_vars() 41684 1727204467.31492: Calling all_inventory to load vars for managed-node1 41684 1727204467.31495: Calling groups_inventory to load vars for managed-node1 41684 1727204467.31498: Calling all_plugins_inventory to load vars for managed-node1 41684 1727204467.31509: Calling all_plugins_play to load vars for managed-node1 41684 1727204467.31512: Calling groups_plugins_inventory to load vars for managed-node1 41684 1727204467.31515: Calling groups_plugins_play to load vars for managed-node1 41684 1727204467.34003: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41684 1727204467.36382: done with get_vars() 41684 1727204467.37205: done getting variables 41684 1727204467.37288: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Get the interface1 MAC address] ****************************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_route_device.yml:99 Tuesday 24 September 2024 15:01:07 -0400 (0:00:00.099) 0:00:23.774 ***** 41684 1727204467.37321: entering _queue_task() for managed-node1/command 41684 1727204467.38106: worker is 1 (out of 1 available) 41684 1727204467.38120: exiting 
_queue_task() for managed-node1/command 41684 1727204467.38133: done queuing things up, now waiting for results queue to drain 41684 1727204467.38134: waiting for pending results... 41684 1727204467.38603: running TaskExecutor() for managed-node1/TASK: Get the interface1 MAC address 41684 1727204467.38731: in run() - task 0affcd87-79f5-3839-086d-000000000064 41684 1727204467.38754: variable 'ansible_search_path' from source: unknown 41684 1727204467.38878: calling self._execute() 41684 1727204467.39054: variable 'ansible_host' from source: host vars for 'managed-node1' 41684 1727204467.39058: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41684 1727204467.39061: variable 'omit' from source: magic vars 41684 1727204467.39475: variable 'ansible_distribution_major_version' from source: facts 41684 1727204467.39489: Evaluated conditional (ansible_distribution_major_version != '6'): True 41684 1727204467.39496: variable 'omit' from source: magic vars 41684 1727204467.39519: variable 'omit' from source: magic vars 41684 1727204467.39657: variable 'interface1' from source: play vars 41684 1727204467.39661: variable 'omit' from source: magic vars 41684 1727204467.39725: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 41684 1727204467.39752: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 41684 1727204467.39778: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 41684 1727204467.39800: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41684 1727204467.39811: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41684 1727204467.39840: variable 'inventory_hostname' from source: host vars for 'managed-node1' 41684 1727204467.39844: variable 
'ansible_host' from source: host vars for 'managed-node1' 41684 1727204467.39851: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41684 1727204467.39976: Set connection var ansible_connection to ssh 41684 1727204467.39980: Set connection var ansible_pipelining to False 41684 1727204467.39982: Set connection var ansible_module_compression to ZIP_DEFLATED 41684 1727204467.39989: Set connection var ansible_timeout to 10 41684 1727204467.39997: Set connection var ansible_shell_executable to /bin/sh 41684 1727204467.39999: Set connection var ansible_shell_type to sh 41684 1727204467.40029: variable 'ansible_shell_executable' from source: unknown 41684 1727204467.40032: variable 'ansible_connection' from source: unknown 41684 1727204467.40035: variable 'ansible_module_compression' from source: unknown 41684 1727204467.40037: variable 'ansible_shell_type' from source: unknown 41684 1727204467.40039: variable 'ansible_shell_executable' from source: unknown 41684 1727204467.40043: variable 'ansible_host' from source: host vars for 'managed-node1' 41684 1727204467.40045: variable 'ansible_pipelining' from source: unknown 41684 1727204467.40049: variable 'ansible_timeout' from source: unknown 41684 1727204467.40053: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41684 1727204467.40212: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 41684 1727204467.40227: variable 'omit' from source: magic vars 41684 1727204467.40233: starting attempt loop 41684 1727204467.40236: running the handler 41684 1727204467.40253: _low_level_execute_command(): starting 41684 1727204467.40260: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 41684 1727204467.41185: 
stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 41684 1727204467.41197: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41684 1727204467.41208: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41684 1727204467.41223: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41684 1727204467.41274: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 41684 1727204467.41282: stderr chunk (state=3): >>>debug2: match not found <<< 41684 1727204467.41292: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204467.41305: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 41684 1727204467.41314: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 41684 1727204467.41320: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 41684 1727204467.41328: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41684 1727204467.41338: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41684 1727204467.41353: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41684 1727204467.41361: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 41684 1727204467.41374: stderr chunk (state=3): >>>debug2: match found <<< 41684 1727204467.41388: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204467.41468: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 41684 1727204467.41492: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41684 1727204467.41507: stderr chunk (state=3): 
>>>debug2: mux_client_hello_exchange: master version 4 <<< 41684 1727204467.41595: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41684 1727204467.43135: stdout chunk (state=3): >>>/root <<< 41684 1727204467.43270: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41684 1727204467.43304: stderr chunk (state=3): >>><<< 41684 1727204467.43308: stdout chunk (state=3): >>><<< 41684 1727204467.43331: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41684 1727204467.43342: _low_level_execute_command(): starting 41684 1727204467.43348: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204467.4333026-43588-280171234434717 `" && echo 
ansible-tmp-1727204467.4333026-43588-280171234434717="` echo /root/.ansible/tmp/ansible-tmp-1727204467.4333026-43588-280171234434717 `" ) && sleep 0' 41684 1727204467.43798: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 41684 1727204467.43804: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41684 1727204467.43853: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204467.43857: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 41684 1727204467.43860: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204467.43906: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 41684 1727204467.43916: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41684 1727204467.43987: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41684 1727204467.45819: stdout chunk (state=3): >>>ansible-tmp-1727204467.4333026-43588-280171234434717=/root/.ansible/tmp/ansible-tmp-1727204467.4333026-43588-280171234434717 <<< 41684 1727204467.45929: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41684 1727204467.45988: stderr chunk 
(state=3): >>><<< 41684 1727204467.45991: stdout chunk (state=3): >>><<< 41684 1727204467.46007: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204467.4333026-43588-280171234434717=/root/.ansible/tmp/ansible-tmp-1727204467.4333026-43588-280171234434717 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41684 1727204467.46033: variable 'ansible_module_compression' from source: unknown 41684 1727204467.46106: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-41684fyviudxd/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 41684 1727204467.46142: variable 'ansible_facts' from source: unknown 41684 1727204467.46243: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204467.4333026-43588-280171234434717/AnsiballZ_command.py 41684 1727204467.46397: Sending initial data 41684 1727204467.46405: Sent initial data (156 bytes) 41684 1727204467.47083: stderr 
chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 41684 1727204467.47086: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41684 1727204467.47121: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204467.47125: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 41684 1727204467.47127: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204467.47232: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 41684 1727204467.47287: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41684 1727204467.48974: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 
debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 41684 1727204467.49021: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 41684 1727204467.49079: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-41684fyviudxd/tmp8_5snu6o /root/.ansible/tmp/ansible-tmp-1727204467.4333026-43588-280171234434717/AnsiballZ_command.py <<< 41684 1727204467.49129: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 41684 1727204467.49966: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41684 1727204467.50089: stderr chunk (state=3): >>><<< 41684 1727204467.50092: stdout chunk (state=3): >>><<< 41684 1727204467.50109: done transferring module to remote 41684 1727204467.50118: _low_level_execute_command(): starting 41684 1727204467.50124: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204467.4333026-43588-280171234434717/ /root/.ansible/tmp/ansible-tmp-1727204467.4333026-43588-280171234434717/AnsiballZ_command.py && sleep 0' 41684 1727204467.50592: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 41684 1727204467.50596: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41684 1727204467.50651: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 41684 1727204467.50654: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204467.50657: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing 
configuration debug1: Reading configuration data /root/.ssh/config <<< 41684 1727204467.50659: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41684 1727204467.50669: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204467.50709: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 41684 1727204467.50714: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41684 1727204467.50789: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41684 1727204467.52501: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41684 1727204467.52560: stderr chunk (state=3): >>><<< 41684 1727204467.52568: stdout chunk (state=3): >>><<< 41684 1727204467.52586: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41684 1727204467.52589: _low_level_execute_command(): starting 41684 1727204467.52594: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204467.4333026-43588-280171234434717/AnsiballZ_command.py && sleep 0' 41684 1727204467.53073: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 41684 1727204467.53080: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41684 1727204467.53115: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204467.53128: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204467.53181: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 41684 1727204467.53193: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 41684 1727204467.53257: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session 
id: 2 <<< 41684 1727204468.66661: stdout chunk (state=3): >>> {"changed": true, "stdout": "96:07:24:63:96:ac", "stderr": "", "rc": 0, "cmd": ["cat", "/sys/class/net/ethtest1/address"], "start": "2024-09-24 15:01:07.661946", "end": "2024-09-24 15:01:08.665662", "delta": "0:00:01.003716", "msg": "", "invocation": {"module_args": {"_raw_params": "cat /sys/class/net/ethtest1/address", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 41684 1727204468.68098: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.148 closed. <<< 41684 1727204468.68102: stdout chunk (state=3): >>><<< 41684 1727204468.68105: stderr chunk (state=3): >>><<< 41684 1727204468.68248: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "96:07:24:63:96:ac", "stderr": "", "rc": 0, "cmd": ["cat", "/sys/class/net/ethtest1/address"], "start": "2024-09-24 15:01:07.661946", "end": "2024-09-24 15:01:08.665662", "delta": "0:00:01.003716", "msg": "", "invocation": {"module_args": {"_raw_params": "cat /sys/class/net/ethtest1/address", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration 
debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.148 closed. 41684 1727204468.68252: done with _execute_module (ansible.legacy.command, {'_raw_params': 'cat /sys/class/net/ethtest1/address', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204467.4333026-43588-280171234434717/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 41684 1727204468.68255: _low_level_execute_command(): starting 41684 1727204468.68257: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204467.4333026-43588-280171234434717/ > /dev/null 2>&1 && sleep 0' 41684 1727204468.70282: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 41684 1727204468.71290: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41684 1727204468.71310: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/ssh/ssh_config <<< 41684 1727204468.71330: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41684 1727204468.71381: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 41684 1727204468.71395: stderr chunk (state=3): >>>debug2: match not found <<< 41684 1727204468.71410: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204468.71434: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 41684 1727204468.71447: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 41684 1727204468.71459: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 41684 1727204468.71473: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41684 1727204468.71487: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41684 1727204468.71504: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41684 1727204468.71516: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 41684 1727204468.71528: stderr chunk (state=3): >>>debug2: match found <<< 41684 1727204468.71541: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204468.71620: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 41684 1727204468.71637: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41684 1727204468.71652: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41684 1727204468.72275: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41684 1727204468.74074: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 
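The remote tmpdir lifecycle visible in this exchange — an umask-protected `mkdir -p` plus a plain `mkdir` on entry, then `rm -f -r` on exit — can be mirrored in Python; a sketch under assumed paths (the directory names here are illustrative, not Ansible's actual `remote_tmp` layout):

```python
import os
import shutil
import tempfile

# Parent dir with 'mkdir -p' semantics, as in the logged command.
base = os.path.join(tempfile.gettempdir(), "demo-ansible-tmp")
os.makedirs(base, exist_ok=True)

# Unique dir with a plain mkdir that fails if it already exists;
# mode 0o700 approximates the effect of 'umask 77' in the log.
unique = os.path.join(base, "ansible-tmp-demo")
os.mkdir(unique, mode=0o700)
print(os.path.isdir(unique))               # True

# ...the module payload would be transferred and executed here...

# Cleanup, as in the logged 'rm -f -r .../ > /dev/null 2>&1'.
shutil.rmtree(unique, ignore_errors=True)
print(os.path.isdir(unique))               # False
```

The plain `mkdir` (rather than `mkdir -p`) for the unique directory is deliberate: it errors out if the timestamped name somehow already exists, rather than silently reusing it.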
41684 1727204468.74094: stdout chunk (state=3): >>><<< 41684 1727204468.74097: stderr chunk (state=3): >>><<< 41684 1727204468.74273: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41684 1727204468.74276: handler run complete 41684 1727204468.74279: Evaluated conditional (False): False 41684 1727204468.74281: attempt loop complete, returning result 41684 1727204468.74283: _execute() done 41684 1727204468.74285: dumping result to json 41684 1727204468.74287: done dumping result, returning 41684 1727204468.74289: done running TaskExecutor() for managed-node1/TASK: Get the interface1 MAC address [0affcd87-79f5-3839-086d-000000000064] 41684 1727204468.74291: sending task result for task 0affcd87-79f5-3839-086d-000000000064 ok: [managed-node1] => { "changed": false, "cmd": [ "cat", "/sys/class/net/ethtest1/address" ], "delta": "0:00:01.003716", 
"end": "2024-09-24 15:01:08.665662", "rc": 0, "start": "2024-09-24 15:01:07.661946" } STDOUT: 96:07:24:63:96:ac 41684 1727204468.74434: no more pending results, returning what we have 41684 1727204468.74438: results queue empty 41684 1727204468.74439: checking for any_errors_fatal 41684 1727204468.74448: done checking for any_errors_fatal 41684 1727204468.74449: checking for max_fail_percentage 41684 1727204468.74450: done checking for max_fail_percentage 41684 1727204468.74451: checking to see if all hosts have failed and the running result is not ok 41684 1727204468.74452: done checking to see if all hosts have failed 41684 1727204468.74453: getting the remaining hosts for this loop 41684 1727204468.74454: done getting the remaining hosts for this loop 41684 1727204468.74458: getting the next task for host managed-node1 41684 1727204468.74471: done getting next task for host managed-node1 41684 1727204468.74477: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 41684 1727204468.74480: ^ state is: HOST STATE: block=3, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 41684 1727204468.74498: getting variables 41684 1727204468.74500: in VariableManager get_vars() 41684 1727204468.74544: Calling all_inventory to load vars for managed-node1 41684 1727204468.74547: Calling groups_inventory to load vars for managed-node1 41684 1727204468.74549: Calling all_plugins_inventory to load vars for managed-node1 41684 1727204468.74558: Calling all_plugins_play to load vars for managed-node1 41684 1727204468.74560: Calling groups_plugins_inventory to load vars for managed-node1 41684 1727204468.74566: Calling groups_plugins_play to load vars for managed-node1 41684 1727204468.75672: done sending task result for task 0affcd87-79f5-3839-086d-000000000064 41684 1727204468.75676: WORKER PROCESS EXITING 41684 1727204468.77792: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41684 1727204468.81788: done with get_vars() 41684 1727204468.81818: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4 Tuesday 24 September 2024 15:01:08 -0400 (0:00:01.446) 0:00:25.220 ***** 41684 1727204468.81928: entering _queue_task() for managed-node1/include_tasks 41684 1727204468.83806: worker is 1 (out of 1 available) 41684 1727204468.83815: exiting _queue_task() for managed-node1/include_tasks 41684 1727204468.83989: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 41684 1727204468.83995: in run() - task 0affcd87-79f5-3839-086d-00000000006c 41684 1727204468.83998: variable 'ansible_search_path' from source: unknown 41684 1727204468.84001: variable 'ansible_search_path' from source: unknown 41684 1727204468.84003: calling self._execute() 41684 1727204468.84055: done queuing things up, now waiting for results queue to drain 41684 1727204468.84058: waiting for pending results... 
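The "Get the interface1 MAC address" result above (rc=0, stdout `96:07:24:63:96:ac`) is the output of a plain command task. A minimal sketch of a task that would produce that result — the task name and the command are taken verbatim from the log, while the `register` variable name is an assumption for illustration:

```yaml
# Hedged reconstruction; only the task name and the command appear in the log.
- name: Get the interface1 MAC address
  ansible.builtin.command: cat /sys/class/net/ethtest1/address
  register: interface1_mac  # assumed variable name, not shown in the log
```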
41684 1727204468.84089: variable 'ansible_host' from source: host vars for 'managed-node1' 41684 1727204468.84180: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41684 1727204468.84193: variable 'omit' from source: magic vars 41684 1727204468.85450: variable 'ansible_distribution_major_version' from source: facts 41684 1727204468.85476: Evaluated conditional (ansible_distribution_major_version != '6'): True 41684 1727204468.85491: _execute() done 41684 1727204468.85500: dumping result to json 41684 1727204468.85507: done dumping result, returning 41684 1727204468.85519: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role [0affcd87-79f5-3839-086d-00000000006c] 41684 1727204468.85531: sending task result for task 0affcd87-79f5-3839-086d-00000000006c 41684 1727204468.85694: no more pending results, returning what we have 41684 1727204468.85700: in VariableManager get_vars() 41684 1727204468.85754: Calling all_inventory to load vars for managed-node1 41684 1727204468.85757: Calling groups_inventory to load vars for managed-node1 41684 1727204468.85760: Calling all_plugins_inventory to load vars for managed-node1 41684 1727204468.85778: Calling all_plugins_play to load vars for managed-node1 41684 1727204468.85782: Calling groups_plugins_inventory to load vars for managed-node1 41684 1727204468.85785: Calling groups_plugins_play to load vars for managed-node1 41684 1727204468.87393: done sending task result for task 0affcd87-79f5-3839-086d-00000000006c 41684 1727204468.87397: WORKER PROCESS EXITING 41684 1727204468.87932: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41684 1727204468.91829: done with get_vars() 41684 1727204468.91861: variable 'ansible_search_path' from source: unknown 41684 1727204468.91866: variable 'ansible_search_path' from source: unknown 41684 1727204468.91912: we have included files to 
process 41684 1727204468.91913: generating all_blocks data 41684 1727204468.91916: done generating all_blocks data 41684 1727204468.91923: processing included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 41684 1727204468.91924: loading included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 41684 1727204468.91926: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 41684 1727204468.93233: done processing included file 41684 1727204468.93235: iterating over new_blocks loaded from include file 41684 1727204468.93237: in VariableManager get_vars() 41684 1727204468.93269: done with get_vars() 41684 1727204468.93272: filtering new block on tags 41684 1727204468.93291: done filtering new block on tags 41684 1727204468.93294: in VariableManager get_vars() 41684 1727204468.93318: done with get_vars() 41684 1727204468.93320: filtering new block on tags 41684 1727204468.93342: done filtering new block on tags 41684 1727204468.93344: in VariableManager get_vars() 41684 1727204468.93371: done with get_vars() 41684 1727204468.93373: filtering new block on tags 41684 1727204468.93391: done filtering new block on tags 41684 1727204468.93393: done iterating over new_blocks loaded from include file included: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml for managed-node1 41684 1727204468.93399: extending task lists for all hosts with included blocks 41684 1727204468.95413: done extending task lists 41684 1727204468.95415: done processing included files 41684 1727204468.95416: results queue empty 41684 1727204468.95417: checking for any_errors_fatal 41684 1727204468.95422: done checking for any_errors_fatal 41684 1727204468.95422: checking for max_fail_percentage 41684 1727204468.95423: done checking for max_fail_percentage 41684 
1727204468.95424: checking to see if all hosts have failed and the running result is not ok 41684 1727204468.95425: done checking to see if all hosts have failed 41684 1727204468.95426: getting the remaining hosts for this loop 41684 1727204468.95427: done getting the remaining hosts for this loop 41684 1727204468.95430: getting the next task for host managed-node1 41684 1727204468.95434: done getting next task for host managed-node1 41684 1727204468.95438: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 41684 1727204468.95442: ^ state is: HOST STATE: block=3, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 41684 1727204468.95454: getting variables 41684 1727204468.95455: in VariableManager get_vars() 41684 1727204468.95482: Calling all_inventory to load vars for managed-node1 41684 1727204468.95485: Calling groups_inventory to load vars for managed-node1 41684 1727204468.95487: Calling all_plugins_inventory to load vars for managed-node1 41684 1727204468.95493: Calling all_plugins_play to load vars for managed-node1 41684 1727204468.95495: Calling groups_plugins_inventory to load vars for managed-node1 41684 1727204468.95498: Calling groups_plugins_play to load vars for managed-node1 41684 1727204468.98480: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41684 1727204469.02536: done with get_vars() 41684 1727204469.02575: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role are present] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:3 Tuesday 24 September 2024 15:01:09 -0400 (0:00:00.207) 0:00:25.428 ***** 41684 1727204469.02659: entering _queue_task() for managed-node1/setup 41684 1727204469.03713: worker is 1 (out of 1 available) 41684 1727204469.03725: exiting _queue_task() for managed-node1/setup 41684 1727204469.03738: done queuing things up, now waiting for results queue to drain 41684 1727204469.03739: waiting for pending results... 
41684 1727204469.04612: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 41684 1727204469.04983: in run() - task 0affcd87-79f5-3839-086d-000000000563 41684 1727204469.05001: variable 'ansible_search_path' from source: unknown 41684 1727204469.05005: variable 'ansible_search_path' from source: unknown 41684 1727204469.05041: calling self._execute() 41684 1727204469.05250: variable 'ansible_host' from source: host vars for 'managed-node1' 41684 1727204469.05255: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41684 1727204469.05269: variable 'omit' from source: magic vars 41684 1727204469.06333: variable 'ansible_distribution_major_version' from source: facts 41684 1727204469.06345: Evaluated conditional (ansible_distribution_major_version != '6'): True 41684 1727204469.06857: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 41684 1727204469.12367: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 41684 1727204469.12552: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 41684 1727204469.12685: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 41684 1727204469.12836: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 41684 1727204469.12869: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 41684 1727204469.13175: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 41684 1727204469.13247: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' 
from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 41684 1727204469.13288: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 41684 1727204469.13328: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 41684 1727204469.13464: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 41684 1727204469.13514: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 41684 1727204469.13537: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 41684 1727204469.13680: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 41684 1727204469.13720: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 41684 1727204469.13734: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 41684 1727204469.14113: variable '__network_required_facts' from source: role 
'' defaults 41684 1727204469.14123: variable 'ansible_facts' from source: unknown 41684 1727204469.16103: Evaluated conditional (__network_required_facts | difference(ansible_facts.keys() | list) | length > 0): False 41684 1727204469.16109: when evaluation is False, skipping this task 41684 1727204469.16111: _execute() done 41684 1727204469.16113: dumping result to json 41684 1727204469.16116: done dumping result, returning 41684 1727204469.16118: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present [0affcd87-79f5-3839-086d-000000000563] 41684 1727204469.16121: sending task result for task 0affcd87-79f5-3839-086d-000000000563 41684 1727204469.16199: done sending task result for task 0affcd87-79f5-3839-086d-000000000563 41684 1727204469.16204: WORKER PROCESS EXITING skipping: [managed-node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 41684 1727204469.16245: no more pending results, returning what we have 41684 1727204469.16250: results queue empty 41684 1727204469.16251: checking for any_errors_fatal 41684 1727204469.16253: done checking for any_errors_fatal 41684 1727204469.16253: checking for max_fail_percentage 41684 1727204469.16255: done checking for max_fail_percentage 41684 1727204469.16256: checking to see if all hosts have failed and the running result is not ok 41684 1727204469.16257: done checking to see if all hosts have failed 41684 1727204469.16258: getting the remaining hosts for this loop 41684 1727204469.16259: done getting the remaining hosts for this loop 41684 1727204469.16267: getting the next task for host managed-node1 41684 1727204469.16277: done getting next task for host managed-node1 41684 1727204469.16281: ^ task is: TASK: fedora.linux_system_roles.network : Check if system is ostree 41684 1727204469.16286: ^ state is: HOST STATE: block=3, task=8, rescue=0, always=0, 
handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 41684 1727204469.16304: getting variables 41684 1727204469.16306: in VariableManager get_vars() 41684 1727204469.16350: Calling all_inventory to load vars for managed-node1 41684 1727204469.16353: Calling groups_inventory to load vars for managed-node1 41684 1727204469.16356: Calling all_plugins_inventory to load vars for managed-node1 41684 1727204469.16387: Calling all_plugins_play to load vars for managed-node1 41684 1727204469.16391: Calling groups_plugins_inventory to load vars for managed-node1 41684 1727204469.16395: Calling groups_plugins_play to load vars for managed-node1 41684 1727204469.19189: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41684 1727204469.23169: done with get_vars() 41684 1727204469.23206: done getting variables TASK [fedora.linux_system_roles.network : Check if system is ostree] *********** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:12 Tuesday 24 September 2024 15:01:09 -0400 (0:00:00.213) 0:00:25.641 ***** 41684 1727204469.24032: entering _queue_task() for managed-node1/stat 41684 1727204469.24391: worker is 1 (out of 1 
available) 41684 1727204469.24404: exiting _queue_task() for managed-node1/stat 41684 1727204469.24417: done queuing things up, now waiting for results queue to drain 41684 1727204469.24418: waiting for pending results... 41684 1727204469.25373: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Check if system is ostree 41684 1727204469.25752: in run() - task 0affcd87-79f5-3839-086d-000000000565 41684 1727204469.25769: variable 'ansible_search_path' from source: unknown 41684 1727204469.25773: variable 'ansible_search_path' from source: unknown 41684 1727204469.25810: calling self._execute() 41684 1727204469.26020: variable 'ansible_host' from source: host vars for 'managed-node1' 41684 1727204469.26024: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41684 1727204469.26034: variable 'omit' from source: magic vars 41684 1727204469.26870: variable 'ansible_distribution_major_version' from source: facts 41684 1727204469.26883: Evaluated conditional (ansible_distribution_major_version != '6'): True 41684 1727204469.27290: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 41684 1727204469.27902: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 41684 1727204469.27951: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 41684 1727204469.27985: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 41684 1727204469.28136: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 41684 1727204469.28220: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 41684 1727204469.28367: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 41684 1727204469.28392: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 41684 1727204469.28420: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 41684 1727204469.28629: variable '__network_is_ostree' from source: set_fact 41684 1727204469.28637: Evaluated conditional (not __network_is_ostree is defined): False 41684 1727204469.28639: when evaluation is False, skipping this task 41684 1727204469.28642: _execute() done 41684 1727204469.28644: dumping result to json 41684 1727204469.28648: done dumping result, returning 41684 1727204469.28772: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Check if system is ostree [0affcd87-79f5-3839-086d-000000000565] 41684 1727204469.28784: sending task result for task 0affcd87-79f5-3839-086d-000000000565 41684 1727204469.28883: done sending task result for task 0affcd87-79f5-3839-086d-000000000565 41684 1727204469.28887: WORKER PROCESS EXITING skipping: [managed-node1] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 41684 1727204469.28939: no more pending results, returning what we have 41684 1727204469.28943: results queue empty 41684 1727204469.28944: checking for any_errors_fatal 41684 1727204469.28952: done checking for any_errors_fatal 41684 1727204469.28953: checking for max_fail_percentage 41684 1727204469.28955: done checking for max_fail_percentage 41684 1727204469.28956: checking to see if all hosts have failed and the running result is not ok 41684 
1727204469.28957: done checking to see if all hosts have failed 41684 1727204469.28958: getting the remaining hosts for this loop 41684 1727204469.28959: done getting the remaining hosts for this loop 41684 1727204469.28969: getting the next task for host managed-node1 41684 1727204469.28977: done getting next task for host managed-node1 41684 1727204469.28982: ^ task is: TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 41684 1727204469.28987: ^ state is: HOST STATE: block=3, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 41684 1727204469.29004: getting variables 41684 1727204469.29005: in VariableManager get_vars() 41684 1727204469.29047: Calling all_inventory to load vars for managed-node1 41684 1727204469.29050: Calling groups_inventory to load vars for managed-node1 41684 1727204469.29051: Calling all_plugins_inventory to load vars for managed-node1 41684 1727204469.29067: Calling all_plugins_play to load vars for managed-node1 41684 1727204469.29069: Calling groups_plugins_inventory to load vars for managed-node1 41684 1727204469.29072: Calling groups_plugins_play to load vars for managed-node1 41684 1727204469.31324: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41684 1727204469.35004: done with get_vars() 41684 1727204469.35039: done getting variables 41684 1727204469.35108: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Set flag to indicate system is ostree] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:17 Tuesday 24 September 2024 15:01:09 -0400 (0:00:00.111) 0:00:25.753 ***** 41684 1727204469.35147: entering _queue_task() for managed-node1/set_fact 41684 1727204469.36377: worker is 1 (out of 1 available) 41684 1727204469.36390: exiting _queue_task() for managed-node1/set_fact 41684 1727204469.36405: done queuing things up, now waiting for results queue to drain 41684 1727204469.36406: waiting for pending results... 
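The "Ensure ansible_facts used by role are present" task earlier in this run was skipped because the conditional `__network_required_facts | difference(ansible_facts.keys() | list) | length > 0` evaluated to False: every fact the role requires was already gathered, so the setup step was unnecessary. A minimal Python sketch of that set-difference check, using assumed illustrative fact names (the real `__network_required_facts` list is `no_log` in this run and does not appear in the log):

```python
# Sketch of the Jinja2 `difference`/`length` conditional seen in the log:
# the role only re-gathers facts when some required fact is missing.
required_facts = ["distribution", "os_family"]  # assumed example values
ansible_facts = {
    "distribution": "Fedora",   # assumed example values
    "os_family": "RedHat",
    "kernel": "6.1.0",
}

# difference(required, present-keys): facts the role needs but lacks
missing = [f for f in required_facts if f not in ansible_facts]
needs_setup = len(missing) > 0  # False -> task skipped, matching the log
print(needs_setup)
```

With no missing facts the conditional is False and the task is skipped, which is exactly the `when evaluation is False, skipping this task` line in the log.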
41684 1727204469.37196: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 41684 1727204469.37471: in run() - task 0affcd87-79f5-3839-086d-000000000566 41684 1727204469.37603: variable 'ansible_search_path' from source: unknown 41684 1727204469.37608: variable 'ansible_search_path' from source: unknown 41684 1727204469.37644: calling self._execute() 41684 1727204469.37931: variable 'ansible_host' from source: host vars for 'managed-node1' 41684 1727204469.37935: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41684 1727204469.37951: variable 'omit' from source: magic vars 41684 1727204469.38798: variable 'ansible_distribution_major_version' from source: facts 41684 1727204469.38811: Evaluated conditional (ansible_distribution_major_version != '6'): True 41684 1727204469.39069: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 41684 1727204469.39613: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 41684 1727204469.39773: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 41684 1727204469.39809: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 41684 1727204469.39839: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 41684 1727204469.40044: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 41684 1727204469.40192: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 41684 1727204469.40236: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 41684 1727204469.40260: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 41684 1727204469.40356: variable '__network_is_ostree' from source: set_fact 41684 1727204469.40365: Evaluated conditional (not __network_is_ostree is defined): False 41684 1727204469.40368: when evaluation is False, skipping this task 41684 1727204469.40371: _execute() done 41684 1727204469.40373: dumping result to json 41684 1727204469.40376: done dumping result, returning 41684 1727204469.40384: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree [0affcd87-79f5-3839-086d-000000000566] 41684 1727204469.40390: sending task result for task 0affcd87-79f5-3839-086d-000000000566 41684 1727204469.40489: done sending task result for task 0affcd87-79f5-3839-086d-000000000566 41684 1727204469.40492: WORKER PROCESS EXITING skipping: [managed-node1] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 41684 1727204469.40540: no more pending results, returning what we have 41684 1727204469.40545: results queue empty 41684 1727204469.40546: checking for any_errors_fatal 41684 1727204469.40553: done checking for any_errors_fatal 41684 1727204469.40554: checking for max_fail_percentage 41684 1727204469.40555: done checking for max_fail_percentage 41684 1727204469.40556: checking to see if all hosts have failed and the running result is not ok 41684 1727204469.40557: done checking to see if all hosts have failed 41684 1727204469.40558: getting the remaining hosts for this loop 41684 1727204469.40560: done getting the remaining hosts for this loop 
41684 1727204469.40567: getting the next task for host managed-node1 41684 1727204469.40577: done getting next task for host managed-node1 41684 1727204469.40581: ^ task is: TASK: fedora.linux_system_roles.network : Check which services are running 41684 1727204469.40585: ^ state is: HOST STATE: block=3, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 41684 1727204469.40603: getting variables 41684 1727204469.40605: in VariableManager get_vars() 41684 1727204469.40646: Calling all_inventory to load vars for managed-node1 41684 1727204469.40649: Calling groups_inventory to load vars for managed-node1 41684 1727204469.40651: Calling all_plugins_inventory to load vars for managed-node1 41684 1727204469.40660: Calling all_plugins_play to load vars for managed-node1 41684 1727204469.40666: Calling groups_plugins_inventory to load vars for managed-node1 41684 1727204469.40669: Calling groups_plugins_play to load vars for managed-node1 41684 1727204469.42521: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41684 1727204469.45118: done with get_vars() 41684 1727204469.45147: done getting variables TASK [fedora.linux_system_roles.network : Check which services are running] **** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 Tuesday 24 September 2024 15:01:09 -0400 (0:00:00.102) 0:00:25.855 ***** 41684 1727204469.45374: entering _queue_task() for managed-node1/service_facts 41684 1727204469.46044: worker is 1 (out of 1 available) 41684 1727204469.46058: exiting _queue_task() for managed-node1/service_facts 41684 1727204469.46117: done queuing things up, now waiting for results queue to drain 41684 1727204469.46119: waiting for pending results... 
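Both ostree tasks above ("Check if system is ostree" at set_facts.yml:12 and "Set flag to indicate system is ostree" at set_facts.yml:17) were skipped because `__network_is_ostree` was already set earlier in the play, making `not __network_is_ostree is defined` False. A hedged sketch of what those two tasks likely look like — the `/run/ostree-booted` path is a common ostree detection marker and is an assumption here, not something shown in the log:

```yaml
# Hedged sketch of set_facts.yml:12 and :17; task names and the `when`
# condition come from the log, the stat path and register name are assumed.
- name: Check if system is ostree
  ansible.builtin.stat:
    path: /run/ostree-booted  # assumed detection path, not shown in the log
  register: __ostree_booted_stat
  when: not __network_is_ostree is defined

- name: Set flag to indicate system is ostree
  ansible.builtin.set_fact:
    __network_is_ostree: "{{ __ostree_booted_stat.stat.exists }}"
  when: not __network_is_ostree is defined
```

Because the fact persists for the rest of the play, subsequent role invocations on the same host skip both tasks, which is the behavior recorded in the log.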
41684 1727204469.46440: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Check which services are running 41684 1727204469.46617: in run() - task 0affcd87-79f5-3839-086d-000000000568 41684 1727204469.46638: variable 'ansible_search_path' from source: unknown 41684 1727204469.46646: variable 'ansible_search_path' from source: unknown 41684 1727204469.46707: calling self._execute() 41684 1727204469.46814: variable 'ansible_host' from source: host vars for 'managed-node1' 41684 1727204469.46825: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41684 1727204469.46838: variable 'omit' from source: magic vars 41684 1727204469.47247: variable 'ansible_distribution_major_version' from source: facts 41684 1727204469.47270: Evaluated conditional (ansible_distribution_major_version != '6'): True 41684 1727204469.47283: variable 'omit' from source: magic vars 41684 1727204469.47367: variable 'omit' from source: magic vars 41684 1727204469.47406: variable 'omit' from source: magic vars 41684 1727204469.47466: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 41684 1727204469.47511: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 41684 1727204469.47541: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 41684 1727204469.47576: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41684 1727204469.47593: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41684 1727204469.47630: variable 'inventory_hostname' from source: host vars for 'managed-node1' 41684 1727204469.47639: variable 'ansible_host' from source: host vars for 'managed-node1' 41684 1727204469.47649: variable 'ansible_ssh_extra_args' from source: host 
vars for 'managed-node1' 41684 1727204469.47779: Set connection var ansible_connection to ssh 41684 1727204469.47793: Set connection var ansible_pipelining to False 41684 1727204469.47804: Set connection var ansible_module_compression to ZIP_DEFLATED 41684 1727204469.47815: Set connection var ansible_timeout to 10 41684 1727204469.47827: Set connection var ansible_shell_executable to /bin/sh 41684 1727204469.47834: Set connection var ansible_shell_type to sh 41684 1727204469.47868: variable 'ansible_shell_executable' from source: unknown 41684 1727204469.47883: variable 'ansible_connection' from source: unknown 41684 1727204469.47892: variable 'ansible_module_compression' from source: unknown 41684 1727204469.47898: variable 'ansible_shell_type' from source: unknown 41684 1727204469.47904: variable 'ansible_shell_executable' from source: unknown 41684 1727204469.47911: variable 'ansible_host' from source: host vars for 'managed-node1' 41684 1727204469.47918: variable 'ansible_pipelining' from source: unknown 41684 1727204469.47924: variable 'ansible_timeout' from source: unknown 41684 1727204469.47931: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41684 1727204469.48184: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 41684 1727204469.48200: variable 'omit' from source: magic vars 41684 1727204469.48217: starting attempt loop 41684 1727204469.48224: running the handler 41684 1727204469.48241: _low_level_execute_command(): starting 41684 1727204469.48254: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 41684 1727204469.49037: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 41684 1727204469.49053: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config 
<<< 41684 1727204469.49077: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41684 1727204469.49104: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41684 1727204469.49148: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 41684 1727204469.49166: stderr chunk (state=3): >>>debug2: match not found <<< 41684 1727204469.49182: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204469.49208: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 41684 1727204469.49221: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 41684 1727204469.49231: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 41684 1727204469.49244: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41684 1727204469.49259: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41684 1727204469.49280: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41684 1727204469.49292: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 41684 1727204469.49304: stderr chunk (state=3): >>>debug2: match found <<< 41684 1727204469.49323: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204469.49398: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 41684 1727204469.49422: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41684 1727204469.49445: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41684 1727204469.49543: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41684 
1727204469.51151: stdout chunk (state=3): >>>/root <<< 41684 1727204469.51348: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41684 1727204469.51352: stdout chunk (state=3): >>><<< 41684 1727204469.51355: stderr chunk (state=3): >>><<< 41684 1727204469.51477: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41684 1727204469.51480: _low_level_execute_command(): starting 41684 1727204469.51484: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204469.513783-43760-276521896180204 `" && echo ansible-tmp-1727204469.513783-43760-276521896180204="` echo /root/.ansible/tmp/ansible-tmp-1727204469.513783-43760-276521896180204 `" ) && sleep 0' 41684 1727204469.52845: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 
3.2.2 4 Jun 2024 <<< 41684 1727204469.52860: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41684 1727204469.52880: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41684 1727204469.52898: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41684 1727204469.52941: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 41684 1727204469.52953: stderr chunk (state=3): >>>debug2: match not found <<< 41684 1727204469.52969: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204469.52987: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 41684 1727204469.52998: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 41684 1727204469.53008: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 41684 1727204469.53018: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41684 1727204469.53032: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41684 1727204469.53048: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41684 1727204469.53059: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 41684 1727204469.53072: stderr chunk (state=3): >>>debug2: match found <<< 41684 1727204469.53085: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204469.53159: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 41684 1727204469.53186: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41684 1727204469.53204: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 
<<< 41684 1727204469.53293: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41684 1727204469.55136: stdout chunk (state=3): >>>ansible-tmp-1727204469.513783-43760-276521896180204=/root/.ansible/tmp/ansible-tmp-1727204469.513783-43760-276521896180204 <<< 41684 1727204469.55683: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41684 1727204469.55766: stderr chunk (state=3): >>><<< 41684 1727204469.55770: stdout chunk (state=3): >>><<< 41684 1727204469.55977: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204469.513783-43760-276521896180204=/root/.ansible/tmp/ansible-tmp-1727204469.513783-43760-276521896180204 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41684 1727204469.55981: variable 'ansible_module_compression' from source: unknown 41684 1727204469.55983: ANSIBALLZ: using cached module: 
/root/.ansible/tmp/ansible-local-41684fyviudxd/ansiballz_cache/ansible.modules.service_facts-ZIP_DEFLATED 41684 1727204469.55986: variable 'ansible_facts' from source: unknown 41684 1727204469.56033: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204469.513783-43760-276521896180204/AnsiballZ_service_facts.py 41684 1727204469.56285: Sending initial data 41684 1727204469.56305: Sent initial data (161 bytes) 41684 1727204469.58028: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 41684 1727204469.58127: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41684 1727204469.58158: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41684 1727204469.58215: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41684 1727204469.58289: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 41684 1727204469.58307: stderr chunk (state=3): >>>debug2: match not found <<< 41684 1727204469.58320: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204469.58336: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 41684 1727204469.58346: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 41684 1727204469.58357: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 41684 1727204469.58387: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41684 1727204469.58403: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41684 1727204469.58417: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41684 1727204469.58429: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 
<<< 41684 1727204469.58439: stderr chunk (state=3): >>>debug2: match found <<< 41684 1727204469.58452: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204469.58536: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 41684 1727204469.58557: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41684 1727204469.58577: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41684 1727204469.58662: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41684 1727204469.60356: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 41684 1727204469.60406: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 41684 1727204469.60465: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-41684fyviudxd/tmp37d6v4uu /root/.ansible/tmp/ansible-tmp-1727204469.513783-43760-276521896180204/AnsiballZ_service_facts.py <<< 41684 1727204469.60515: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 41684 1727204469.61851: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41684 1727204469.61919: stderr chunk (state=3): >>><<< 41684 1727204469.61923: stdout chunk (state=3): >>><<< 41684 1727204469.61942: done 
transferring module to remote 41684 1727204469.61957: _low_level_execute_command(): starting 41684 1727204469.61962: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204469.513783-43760-276521896180204/ /root/.ansible/tmp/ansible-tmp-1727204469.513783-43760-276521896180204/AnsiballZ_service_facts.py && sleep 0' 41684 1727204469.62618: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 41684 1727204469.62627: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41684 1727204469.62638: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41684 1727204469.62651: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41684 1727204469.62695: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 41684 1727204469.62702: stderr chunk (state=3): >>>debug2: match not found <<< 41684 1727204469.62715: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204469.62725: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 41684 1727204469.62731: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 41684 1727204469.62738: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 41684 1727204469.62745: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41684 1727204469.62754: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41684 1727204469.62770: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41684 1727204469.62778: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 41684 1727204469.62785: stderr chunk (state=3): >>>debug2: 
match found <<< 41684 1727204469.62794: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204469.62868: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 41684 1727204469.62884: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41684 1727204469.62894: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41684 1727204469.62975: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41684 1727204469.64758: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41684 1727204469.64828: stderr chunk (state=3): >>><<< 41684 1727204469.64831: stdout chunk (state=3): >>><<< 41684 1727204469.64878: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41684 1727204469.64898: 
_low_level_execute_command(): starting 41684 1727204469.64967: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204469.513783-43760-276521896180204/AnsiballZ_service_facts.py && sleep 0' 41684 1727204469.65891: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 41684 1727204469.65907: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41684 1727204469.65936: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41684 1727204469.65971: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41684 1727204469.66026: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 41684 1727204469.66059: stderr chunk (state=3): >>>debug2: match not found <<< 41684 1727204469.66085: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204469.66103: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 41684 1727204469.66131: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 41684 1727204469.66150: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 41684 1727204469.66184: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41684 1727204469.66212: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41684 1727204469.66241: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41684 1727204469.66283: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 41684 1727204469.66306: stderr chunk (state=3): >>>debug2: match found <<< 41684 1727204469.66328: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204469.66452: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 41684 1727204469.66503: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41684 1727204469.66532: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41684 1727204469.66667: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41684 1727204470.94545: stdout chunk (state=3): >>> {"ansible_facts": {"services": {"auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": 
"dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": 
{"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": 
"stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-qu<<< 41684 1727204470.94624: stdout chunk (state=3): >>>it-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", 
"state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "snapd.seeded.service": {"name": "snapd.seeded.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": 
"systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": 
"systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles.service": {"name": "systemd-tmpfiles.service", "state": "stopped", "status": "not-found", "source": 
"systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "yppasswdd.service": {"name": "yppasswdd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ypserv.service": {"name": "ypserv.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ypxfrd.service": {"name": "ypxfrd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": 
"systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "cpupower.service": {"name": "cpupower.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", 
"status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "inactive", "status": "static", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "oddjobd.service": {"name": "oddjobd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, 
"qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon.service": {"name": "quotaon.service", "state": "inactive", "status": "static", "source": "systemd"}, "rdisc.service": {"name": "rdisc.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": 
"systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate-resume@.service": {"name": "systemd-hibernate-resume@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": 
"inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-quotacheck.service": {"name": "systemd-quotacheck.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "teamd@.service": {"name": "teamd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, 
"invocation": {"module_args": {}}} <<< 41684 1727204470.95891: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.148 closed. <<< 41684 1727204470.95973: stderr chunk (state=3): >>><<< 41684 1727204470.95976: stdout chunk (state=3): >>><<< 41684 1727204470.96274: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"services": {"auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", 
"state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": 
"irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": 
"static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", 
"source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "snapd.seeded.service": {"name": "snapd.seeded.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "stopped", "status": "enabled", 
"source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": 
"systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles.service": {"name": "systemd-tmpfiles.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": 
"systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "yppasswdd.service": {"name": "yppasswdd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ypserv.service": {"name": "ypserv.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ypxfrd.service": {"name": "ypxfrd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": 
"chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "cpupower.service": {"name": "cpupower.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": 
{"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "inactive", "status": "static", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "oddjobd.service": {"name": "oddjobd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon.service": {"name": "quotaon.service", "state": 
"inactive", "status": "static", "source": "systemd"}, "rdisc.service": {"name": "rdisc.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", 
"source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate-resume@.service": {"name": "systemd-hibernate-resume@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, 
"systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-quotacheck.service": {"name": "systemd-quotacheck.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "teamd@.service": {"name": "teamd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.148 closed. 
41684 1727204470.96685: done with _execute_module (service_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'service_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204469.513783-43760-276521896180204/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 41684 1727204470.96706: _low_level_execute_command(): starting 41684 1727204470.96717: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204469.513783-43760-276521896180204/ > /dev/null 2>&1 && sleep 0' 41684 1727204470.97616: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 41684 1727204470.97634: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41684 1727204470.97651: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41684 1727204470.97672: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41684 1727204470.97715: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 41684 1727204470.97727: stderr chunk (state=3): >>>debug2: match not found <<< 41684 1727204470.97745: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204470.97765: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 41684 1727204470.97778: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address 
<<< 41684 1727204470.97790: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 41684 1727204470.97802: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41684 1727204470.97816: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41684 1727204470.97832: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41684 1727204470.97845: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 41684 1727204470.97860: stderr chunk (state=3): >>>debug2: match found <<< 41684 1727204470.97877: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204470.97950: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 41684 1727204470.98007: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41684 1727204470.98022: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41684 1727204470.98192: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41684 1727204470.99882: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41684 1727204470.99929: stderr chunk (state=3): >>><<< 41684 1727204470.99932: stdout chunk (state=3): >>><<< 41684 1727204470.99949: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: 
re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41684 1727204470.99955: handler run complete 41684 1727204471.00064: variable 'ansible_facts' from source: unknown 41684 1727204471.00161: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41684 1727204471.00403: variable 'ansible_facts' from source: unknown 41684 1727204471.00480: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41684 1727204471.00586: attempt loop complete, returning result 41684 1727204471.00589: _execute() done 41684 1727204471.00592: dumping result to json 41684 1727204471.00628: done dumping result, returning 41684 1727204471.00636: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Check which services are running [0affcd87-79f5-3839-086d-000000000568] 41684 1727204471.00642: sending task result for task 0affcd87-79f5-3839-086d-000000000568 41684 1727204471.01143: done sending task result for task 0affcd87-79f5-3839-086d-000000000568 41684 1727204471.01146: WORKER PROCESS EXITING ok: [managed-node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 41684 1727204471.01242: no more pending results, returning what we have 41684 1727204471.01245: results queue empty 41684 1727204471.01246: checking for 
any_errors_fatal 41684 1727204471.01250: done checking for any_errors_fatal 41684 1727204471.01250: checking for max_fail_percentage 41684 1727204471.01252: done checking for max_fail_percentage 41684 1727204471.01253: checking to see if all hosts have failed and the running result is not ok 41684 1727204471.01254: done checking to see if all hosts have failed 41684 1727204471.01254: getting the remaining hosts for this loop 41684 1727204471.01256: done getting the remaining hosts for this loop 41684 1727204471.01259: getting the next task for host managed-node1 41684 1727204471.01269: done getting next task for host managed-node1 41684 1727204471.01274: ^ task is: TASK: fedora.linux_system_roles.network : Check which packages are installed 41684 1727204471.01279: ^ state is: HOST STATE: block=3, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 41684 1727204471.01290: getting variables 41684 1727204471.01292: in VariableManager get_vars() 41684 1727204471.01326: Calling all_inventory to load vars for managed-node1 41684 1727204471.01329: Calling groups_inventory to load vars for managed-node1 41684 1727204471.01331: Calling all_plugins_inventory to load vars for managed-node1 41684 1727204471.01340: Calling all_plugins_play to load vars for managed-node1 41684 1727204471.01343: Calling groups_plugins_inventory to load vars for managed-node1 41684 1727204471.01359: Calling groups_plugins_play to load vars for managed-node1 41684 1727204471.03867: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41684 1727204471.05623: done with get_vars() 41684 1727204471.05654: done getting variables TASK [fedora.linux_system_roles.network : Check which packages are installed] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26 Tuesday 24 September 2024 15:01:11 -0400 (0:00:01.603) 0:00:27.459 ***** 41684 1727204471.05762: entering _queue_task() for managed-node1/package_facts 41684 1727204471.06131: worker is 1 (out of 1 available) 41684 1727204471.06145: exiting _queue_task() for managed-node1/package_facts 41684 1727204471.06158: done queuing things up, now waiting for results queue to drain 41684 1727204471.06159: waiting for pending results... 
41684 1727204471.06606: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Check which packages are installed 41684 1727204471.06756: in run() - task 0affcd87-79f5-3839-086d-000000000569 41684 1727204471.06792: variable 'ansible_search_path' from source: unknown 41684 1727204471.06796: variable 'ansible_search_path' from source: unknown 41684 1727204471.06834: calling self._execute() 41684 1727204471.06931: variable 'ansible_host' from source: host vars for 'managed-node1' 41684 1727204471.06937: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41684 1727204471.06946: variable 'omit' from source: magic vars 41684 1727204471.07334: variable 'ansible_distribution_major_version' from source: facts 41684 1727204471.07348: Evaluated conditional (ansible_distribution_major_version != '6'): True 41684 1727204471.07354: variable 'omit' from source: magic vars 41684 1727204471.07433: variable 'omit' from source: magic vars 41684 1727204471.07468: variable 'omit' from source: magic vars 41684 1727204471.07515: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 41684 1727204471.07551: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 41684 1727204471.07575: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 41684 1727204471.07594: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41684 1727204471.07611: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41684 1727204471.07642: variable 'inventory_hostname' from source: host vars for 'managed-node1' 41684 1727204471.07645: variable 'ansible_host' from source: host vars for 'managed-node1' 41684 1727204471.07647: variable 'ansible_ssh_extra_args' from source: host 
vars for 'managed-node1' 41684 1727204471.07753: Set connection var ansible_connection to ssh 41684 1727204471.07758: Set connection var ansible_pipelining to False 41684 1727204471.07767: Set connection var ansible_module_compression to ZIP_DEFLATED 41684 1727204471.07772: Set connection var ansible_timeout to 10 41684 1727204471.07780: Set connection var ansible_shell_executable to /bin/sh 41684 1727204471.07783: Set connection var ansible_shell_type to sh 41684 1727204471.07807: variable 'ansible_shell_executable' from source: unknown 41684 1727204471.07810: variable 'ansible_connection' from source: unknown 41684 1727204471.07814: variable 'ansible_module_compression' from source: unknown 41684 1727204471.07817: variable 'ansible_shell_type' from source: unknown 41684 1727204471.07819: variable 'ansible_shell_executable' from source: unknown 41684 1727204471.07827: variable 'ansible_host' from source: host vars for 'managed-node1' 41684 1727204471.07832: variable 'ansible_pipelining' from source: unknown 41684 1727204471.07834: variable 'ansible_timeout' from source: unknown 41684 1727204471.07838: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41684 1727204471.08037: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 41684 1727204471.08054: variable 'omit' from source: magic vars 41684 1727204471.08058: starting attempt loop 41684 1727204471.08060: running the handler 41684 1727204471.08077: _low_level_execute_command(): starting 41684 1727204471.08085: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 41684 1727204471.08896: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 41684 1727204471.08911: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config 
<<< 41684 1727204471.08928: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41684 1727204471.08943: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41684 1727204471.08983: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 41684 1727204471.08990: stderr chunk (state=3): >>>debug2: match not found <<< 41684 1727204471.09000: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204471.09013: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 41684 1727204471.09023: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 41684 1727204471.09035: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 41684 1727204471.09043: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41684 1727204471.09057: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41684 1727204471.09071: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41684 1727204471.09079: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 41684 1727204471.09086: stderr chunk (state=3): >>>debug2: match found <<< 41684 1727204471.09095: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204471.09176: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 41684 1727204471.09191: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41684 1727204471.09200: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41684 1727204471.09288: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41684 
1727204471.10824: stdout chunk (state=3): >>>/root <<< 41684 1727204471.11292: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41684 1727204471.11512: stderr chunk (state=3): >>><<< 41684 1727204471.11525: stdout chunk (state=3): >>><<< 41684 1727204471.11659: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41684 1727204471.11666: _low_level_execute_command(): starting 41684 1727204471.11670: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204471.1155484-43826-95945843935074 `" && echo ansible-tmp-1727204471.1155484-43826-95945843935074="` echo /root/.ansible/tmp/ansible-tmp-1727204471.1155484-43826-95945843935074 `" ) && sleep 0' 41684 1727204471.12436: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 
3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 41684 1727204471.12439: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41684 1727204471.12492: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204471.12496: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found <<< 41684 1727204471.12499: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204471.12553: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 41684 1727204471.13186: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41684 1727204471.13453: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41684 1727204471.15298: stdout chunk (state=3): >>>ansible-tmp-1727204471.1155484-43826-95945843935074=/root/.ansible/tmp/ansible-tmp-1727204471.1155484-43826-95945843935074 <<< 41684 1727204471.15404: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41684 1727204471.15489: stderr chunk (state=3): >>><<< 41684 1727204471.15492: stdout chunk (state=3): >>><<< 41684 1727204471.15678: _low_level_execute_command() done: rc=0, 
stdout=ansible-tmp-1727204471.1155484-43826-95945843935074=/root/.ansible/tmp/ansible-tmp-1727204471.1155484-43826-95945843935074 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41684 1727204471.15681: variable 'ansible_module_compression' from source: unknown 41684 1727204471.15684: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-41684fyviudxd/ansiballz_cache/ansible.modules.package_facts-ZIP_DEFLATED 41684 1727204471.15787: variable 'ansible_facts' from source: unknown 41684 1727204471.15924: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204471.1155484-43826-95945843935074/AnsiballZ_package_facts.py 41684 1727204471.16721: Sending initial data 41684 1727204471.16724: Sent initial data (161 bytes) 41684 1727204471.19566: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 41684 1727204471.19619: stderr chunk (state=3): >>>debug1: Reading 
configuration data /root/.ssh/config <<< 41684 1727204471.19639: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41684 1727204471.19658: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41684 1727204471.19754: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 41684 1727204471.19771: stderr chunk (state=3): >>>debug2: match not found <<< 41684 1727204471.19786: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204471.19804: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 41684 1727204471.19849: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 41684 1727204471.19861: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 41684 1727204471.19878: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41684 1727204471.19892: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41684 1727204471.19906: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41684 1727204471.19921: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 41684 1727204471.19931: stderr chunk (state=3): >>>debug2: match found <<< 41684 1727204471.19969: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204471.20168: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 41684 1727204471.20196: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41684 1727204471.20212: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41684 1727204471.20299: stderr chunk (state=3): >>>debug1: 
mux_client_request_session: master session id: 2 <<< 41684 1727204471.22013: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 41684 1727204471.22070: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 41684 1727204471.22126: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-41684fyviudxd/tmp8zpvp3jz /root/.ansible/tmp/ansible-tmp-1727204471.1155484-43826-95945843935074/AnsiballZ_package_facts.py <<< 41684 1727204471.22191: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 41684 1727204471.25568: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41684 1727204471.25670: stderr chunk (state=3): >>><<< 41684 1727204471.25674: stdout chunk (state=3): >>><<< 41684 1727204471.25677: done transferring module to remote 41684 1727204471.25679: _low_level_execute_command(): starting 41684 1727204471.25681: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204471.1155484-43826-95945843935074/ /root/.ansible/tmp/ansible-tmp-1727204471.1155484-43826-95945843935074/AnsiballZ_package_facts.py && sleep 0' 41684 1727204471.26477: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 41684 1727204471.26494: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 
41684 1727204471.26521: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41684 1727204471.26545: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41684 1727204471.26598: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 41684 1727204471.26611: stderr chunk (state=3): >>>debug2: match not found <<< 41684 1727204471.26634: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204471.26652: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 41684 1727204471.26666: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 41684 1727204471.26679: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 41684 1727204471.26690: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41684 1727204471.26702: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41684 1727204471.26715: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41684 1727204471.26726: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 41684 1727204471.26746: stderr chunk (state=3): >>>debug2: match found <<< 41684 1727204471.26761: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204471.26846: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 41684 1727204471.26875: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41684 1727204471.26891: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41684 1727204471.27006: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41684 
1727204471.28800: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41684 1727204471.28804: stdout chunk (state=3): >>><<< 41684 1727204471.28806: stderr chunk (state=3): >>><<< 41684 1727204471.28870: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41684 1727204471.28873: _low_level_execute_command(): starting 41684 1727204471.28876: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204471.1155484-43826-95945843935074/AnsiballZ_package_facts.py && sleep 0' 41684 1727204471.30716: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 41684 1727204471.30720: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 41684 1727204471.30722: stderr chunk (state=3): 
>>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41684 1727204471.30757: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204471.30761: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 41684 1727204471.30769: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204471.30845: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 41684 1727204471.30848: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 41684 1727204471.30937: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41684 1727204471.76733: stdout chunk (state=3): >>> {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240905", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240905", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.1.0", "release": 
"6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools-wheel": [{"name": "python3-setuptools-wheel", "version": "53.0.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20210518", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.40", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.4", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libreport-filesystem": [{"name": "libreport-filesystem", "version": "2.15.2", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.348", "release": "9.15.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "7.el9.1", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dejavu-sans-fonts": [{"name": "dejavu-sans-fonts", "version": "2.37", "release": "18.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-core-font-en": [{"name": "langpacks-core-font-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "1.3.4", "release": "7.el9", 
"epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "8.32", "release": "36.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.13.7", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.16", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gli<<< 41684 1727204471.76750: stdout chunk (state=3): >>>bc": [{"name": "glibc", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.1.8", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib": [{"name": "zlib", "version": "1.2.11", "release": "41.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"xz-libs": [{"name": "xz-libs", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.48", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.18", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.34.1", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.4", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.9.13", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240828", "release": "2.git626aa59.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.24", 
"release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.2", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "0.9.10", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.4", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.2", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.42", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.0", "release": "13.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": 
"lz4-libs", "version": "1.9.3", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.40", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.8", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.8.0", "release": "7.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.9", "release": "9.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.10.0", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.39", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "38.20210216cvs.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.4", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.14", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtasn1": 
[{"name": "libtasn1", "version": "4.16.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.3", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.3", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.39", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.5", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.4.0", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.6", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdb": [{"name": "libdb", "version": "5.3.28", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.2.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": <<< 41684 1727204471.76836: stdout chunk (state=3): >>>"53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", 
"version": "0.1.5", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.2", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsigsegv": [{"name": "libsigsegv", "version": "2.13", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre": [{"name": "pcre", "version": "8.44", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.6", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "8.32", "release": "36.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "91.4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "252", "release": "47.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.37.4", "release": "20.el9", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.12", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "28", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "28", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.6", "release": "27.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "29.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.6", "release": "27.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-tools": [{"name": "dbus-tools", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "3.3.17", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.5.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.4", "release": "10.git1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": 
"0.1.18", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3-cli": [{"name": "libnl3-cli", "version": "3.9.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libteam": [{"name": "libteam", "version": "1.31", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "acl": [{"name": "acl", "version": "2.3.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.21", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext": [{"name": "gettext", "version": "0.21", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "attr": [{"name": "attr", "version": "2.5.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.1.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.1.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.1", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.5.1", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.2", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.11", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": 
"7.11", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.22.4", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.8", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-compat": [{"name": "libxcrypt-compat", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "21.3.1", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.18", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.15.0", "release": "9.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.1", "release": "7.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "234", "release": "19.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.2", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "pigz": [{"name": "pigz", "version": "2.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "590", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-rpm-macros": [{"name": "systemd-rpm-macros", "version": "252", "release": "47.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.19.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.13", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.7", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "49", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.0.9", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.7.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdaemon": [{"name": "libdaemon", "version": "0.14", "release": "23.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "teamd": [{"name": "teamd", "version": "1.31", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.4.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.4", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.5.1", "release": "20.el9", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "252", "release": "47.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "252", "release": "47.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "28", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.5.7", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.5.7", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": "1.11", "release": "26.20190603git.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el9", "epoch": 9, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el9", "epoch": 9, "arch": "x86_64", "source": "rpm"}], 
"grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "rel<<< 41684 1727204471.76847: stdout chunk (state=3): >>>ease": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.8.7", "release": "32.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.11.8", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20210202", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.4.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.4.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.18.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oddjob": [{"name": "oddjob", "version": "0.34.7", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oddjob-mkhomedir": [{"name": "oddjob-mkhomedir", "version": "0.34.7", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": 
"1.9", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "3.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "3.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.43.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": 
"cyrus-sasl-lib", "version": "2.1.27", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.6", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.4", "release": "13.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "7.76.1", "release": "31.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.35.2", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.35.2", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "3.2.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "252", "release": "47.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.13.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "os-prober": [{"name": "os-prober", "version": "1.77", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "7.76.1", "release": "31.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "63.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.24", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "38.1.45", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "38.1.45", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "7.el9", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.29", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.9.5", "rel<<< 41684 1727204471.76854: stdout chunk (state=3): >>>ease": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.9.1", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.68.4", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.2.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.13.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.68.0", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.40.1", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.40.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libuser": 
[{"name": "libuser", "version": "0.63", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.3.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpgme": [{"name": "gpgme", "version": "1.15.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.14.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gpg": [{"name": "python3-gpg", "version": "1.15.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.6", "release": "1.el9.6", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.6", 
"release": "17.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.7.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.47", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], <<< 41684 1727204471.76859: stdout chunk (state=3): >>>"slang": [{"name": "slang", "version": "2.3.2", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.21", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.12.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "0.99.9", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsyslog-logrotate": [{"name": "rsyslog-logrotate", "version": "8.2310.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2310.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "10.el9", "epoch": 17, "arch": "x86_64", "source": 
"rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.4.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.47", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.27", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "passwd": [{"name": "passwd", "version": "0.80", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "NetworkManager-team": [{"name": "NetworkManager-team", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.11.8", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240828", "release": "2.git626aa59.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "5.14.0", "r<<< 41684 1727204471.76868: stdout chunk (state=3): >>>elease": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.5p2", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.9.3", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.2.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-compat": [{"name": "authselect-compat", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el9", "epoch": 4, "arch": "noarch", "source": "rpm"}], "python3-libselinux": 
[{"name": "python3-libselinux", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.1.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.19.2", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwl100-firmware": [{"name": "iwl100-firmware", "version": "39.31.5.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl1000-firmware": [{"name": "iwl1000-firmware", "version": "39.31.5.1", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "iwl105-firmware": [{"name": "iwl105-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl135-firmware": [{"name": "iwl135-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl2000-firmware": [{"name": "iwl2000-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl2030-firmware": [{"name": "iwl2030-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl3160-firmware": [{"name": "iwl3160-firmware", "version": "25.30.13.0", "release": 
"146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "iwl5000-firmware": [{"name": "iwl5000-firmware", "version": "8.83.5.1_1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl5150-firmware": [{"name": "iwl5150-firmware", "version": "8.24.2.2", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl6000g2a-firmware": [{"name": "iwl6000g2a-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl6050-firmware": [{"name": "iwl6050-firmware", "version": "41.28.5.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl7260-firmware": [{"name": "iwl7260-firmware", "version": "25.30.13.0", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "rootfiles"<<< 41684 1727204471.76895: stdout chunk (state=3): >>>: [{"name": "rootfiles", "version": "8.1", "release": "31.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "gpg-pubkey": [{"name": "gpg-pubkey", "version": "3228467c", "release": "613798eb", "epoch": null, "arch": null, "source": "rpm"}, {"name": "gpg-pubkey", "version": "8483c65d", "release": "5ccc5b19", "epoch": null, "arch": null, "source": "rpm"}], "epel-release": [{"name": "epel-release", "version": "9", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "nspr": [{"name": "nspr", "version": "4.35.0", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-system": [{"name": "boost-system", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-util": [{"name": "nss-util", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.3", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "m4": [{"name": "m4", "version": "1.4.19", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmpc": [{"name": 
"libmpc", "version": "1.2.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unzip": [{"name": "unzip", "version": "6.0", "release": "57.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "avahi-libs": [{"name": "avahi-libs", "version": "0.8", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zip": [{"name": "zip", "version": "3.0", "release": "35.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpp": [{"name": "cpp", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bison": [{"name": "bison", "version": "3.7.4", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "flex": [{"name": "flex", "version": "2.6.4", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn-freebl": [{"name": "nss-softokn-freebl", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn": [{"name": "nss-softokn", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss": [{"name": "nss", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-sysinit": [{"name": "nss-sysinit", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-filesystem": [{"name": "boost-filesystem", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-thread": [{"name": "boost-thread", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.19", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.58", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.80", "release": "481.el9", "epoch": 0, "arch": 
"x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.03", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.174", "release": "462.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.13", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.09", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20200520", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.60.800", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.41", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.300", "release": "7.el9", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2013.0523", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.073", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": 
"perl-Net-SSLeay", "version": "1.94", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.66", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "1.94", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.21", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-subs": [{"name": "perl-subs", "version": "1.03", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.17", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.42", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.076", "release": "462.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.031", "release": "4.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.08", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.09", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "4.14", "release": "460.el9", "epoch": 1, "arch": "noarch", 
"source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.13", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.30", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.23", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.43", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.01", "release": "4.el9", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.30", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.85", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.12", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.56", "release": "462.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.21", "release": "460.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": 
"1.31", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.238", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.52", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.50", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.74", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.15", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.78", "release": "461.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.08", "release": "462.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.32.1", "release": "481.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.32.1", "release": "481.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "aspell": [{"name": "aspell", "version": "0.60.8", "release": "8.el9", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "tbb": [{"name": "tbb", "version": "2020.3", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dyninst": [{"name": "dyninst", "version": "12.1.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-runtime": [{"name": "systemtap-runtime", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-headers": [{"name": "kernel-headers", "version": "5.14.0", "release": 
"511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-headers": [{"name": "glibc-headers", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "strace": [{"name": "strace", "version": "5.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-m4": [{"name": "pkgconf-m4", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libpkgconf": [{"name": "libpkgconf", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf": [{"name": "pkgconf", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-pkg-config": [{"name": "pkgconf-pkg-config", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd-devel": [{"name": "libzstd-devel", "version": "1.5.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-devel": [{"name": "zlib-devel", "version": "1.2.11", "release": "41.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf-devel": [{"name": "elfutils-libelf-devel", "version": "<<< 41684 1727204471.76955: stdout chunk (state=3): >>>0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-devel": [{"name": "glibc-devel", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-devel": [{"name": "libxcrypt-devel", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gcc": [{"name": "gcc", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-devel": [{"name": "openssl-devel", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-devel": [{"name": "kernel-devel", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"xz-devel": [{"name": "xz-devel", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-devel": [{"name": "elfutils-devel", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-devel": [{"name": "systemtap-devel", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "efivar-libs": [{"name": "efivar-libs", "version": "38", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mokutil": [{"name": "mokutil", "version": "0.6.0", "release": "4.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "systemtap-client": [{"name": "systemtap-client", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap": [{"name": "systemtap", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "qa-tools": [{"name": "qa-tools", "version": "4.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.3", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.5.4", "release": "27.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.6", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.21.1", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.37", 
"release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "7.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.47", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.34", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "4.6.5", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpm-libs": [{"name": "gpm-libs", "version": "1.20.7", "release": "29.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "emacs-filesystem": [{"name": "emacs-filesystem", "version": "27.2", "release": "10.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "s<<< 41684 1727204471.76958: stdout chunk (state=3): >>>ource": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "18.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.34", "release": "7.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": 
"4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "4.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "5.4.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.8.4", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.5.4", "release": "27.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.2.3", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "53.0.0", "release": "13.el9", "epoch": null, "arch": "noarch", 
"source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.5.0", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "2.10", "release": "7.el9.1", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pytz": [{"name": "python3-pytz", "version": "2021.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-babel": [{"name": "python3-babel", "version": "2.9.1", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.4", "release": "12.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyrsistent": [{"name": "python3-pyrsistent", "version": "0.17.3", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-prettytable": [{"name": "python3-prettytable", "version": "0.7.2", "release": "27.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.1.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-netifaces": [{"name": "python3-netifaces", "version": "0.10.6", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "1.1.1", "release": "12.el<<< 41684 1727204471.76965: stdout chunk (state=3): >>>9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "2.11.3", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.0", "release": 
"4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.21", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.6", "release": "25.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "20.3.0", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "3.2.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libmaxminddb": [{"name": "libmaxminddb", "version": "1.5.2", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "geolite2-country": [{"name": "geolite2-country", "version": "20191217", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "geolite2-city": [{"name": "geolite2-city", "version": "20191217", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "ipcalc": [{"name": "ipcalc", "version": "1.0.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdisk": [{"name": "gdisk", "version": "1.0.7", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pysocks": [{"name": "python3-pysocks", "version": "1.7.1", "release": "12.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.5", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"python3-chardet": [{"name": "python3-chardet", "version": "4.0.0", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.25.1", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dhcp-common": [{"name": "dhcp-common", "version": "4.4.2", "release": "19.b1.el9", "epoch": 12, "arch": "noarch", "source": "rpm"}], "dhcp-client": [{"name": "dhcp-client", "version": "4.4.2", "release": "19.b1.el9", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "23.4", "release": "19.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.16", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip": [{"name": "python3-pip", "version": "21.3.1", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "5.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.85", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} <<< 41684 1727204471.78423: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.148 closed. 
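The stdout streamed above is the JSON result of Ansible's `package_facts` module: under `ansible_facts.packages`, each package name maps to a *list* of install records (one per arch/version), each with `name`, `version`, `release`, `epoch`, `arch`, and `source` keys. A minimal sketch of reading that structure — the sample document below holds just one entry copied from the log, not the full result:

```python
import json

# One entry lifted from the log above ("dnsmasq": 2.85-16.el9); the
# surrounding structure mirrors the package_facts result shape.
raw = json.dumps({
    "ansible_facts": {
        "packages": {
            "dnsmasq": [{"name": "dnsmasq", "version": "2.85",
                         "release": "16.el9", "epoch": None,
                         "arch": "x86_64", "source": "rpm"}],
        }
    }
})

packages = json.loads(raw)["ansible_facts"]["packages"]

# Each value is a list because a package can be installed for more than
# one arch (e.g. i686 and x86_64); take the first record here.
dnsmasq = packages["dnsmasq"][0]
print(dnsmasq["version"])  # 2.85
```

In a playbook the same lookup is typically written as `ansible_facts.packages['dnsmasq'][0].version` after a `package_facts:` task; note that `epoch` is `null`/`None` rather than `0` when the RPM has no epoch set.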
<<< 41684 1727204471.78506: stderr chunk (state=3): >>><<< 41684 1727204471.78509: stdout chunk (state=3): >>><<< 41684 1727204471.78781: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240905", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240905", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.1.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools-wheel": [{"name": "python3-setuptools-wheel", "version": "53.0.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20210518", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.40", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.4", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libreport-filesystem": [{"name": "libreport-filesystem", "version": "2.15.2", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.4.0", "release": "10.el9", 
"epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.348", "release": "9.15.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "7.el9.1", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dejavu-sans-fonts": [{"name": "dejavu-sans-fonts", "version": "2.37", "release": "18.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-core-font-en": [{"name": "langpacks-core-font-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "8.32", "release": "36.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.13.7", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.16", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.1.8", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib": [{"name": "zlib", "version": "1.2.11", "release": "41.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.48", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.18", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.34.1", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "11.5.0", "release": "2.el9", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.4", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.9.13", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240828", "release": "2.git626aa59.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.24", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.2", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "0.9.10", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.4", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.2", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.42", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.0", "release": "13.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], 
"keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.3", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.40", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.8", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.8.0", "release": "7.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.9", "release": "9.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.10.0", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.39", "release": "16.el9", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "38.20210216cvs.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.4", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.14", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.16.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.3", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.3", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.39", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.5", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.4.0", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.6", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "53.el9", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdb": [{"name": "libdb", "version": "5.3.28", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.2.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.2", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsigsegv": [{"name": "libsigsegv", "version": "2.13", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre": [{"name": "pcre", "version": "8.44", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.6", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "8.32", "release": "36.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": 
"ca-certificates", "version": "2024.2.69_v8.0.303", "release": "91.4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "252", "release": "47.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.12", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "28", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "28", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.6", "release": "27.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "29.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.6", "release": "27.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-tools": [{"name": "dbus-tools", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "3.3.17", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "6.el9", "epoch": 
1, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.5.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.4", "release": "10.git1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.18", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3-cli": [{"name": "libnl3-cli", "version": "3.9.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libteam": [{"name": "libteam", "version": "1.31", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "acl": [{"name": "acl", "version": "2.3.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.21", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext": [{"name": "gettext", "version": "0.21", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "attr": [{"name": "attr", "version": "2.5.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": 
"4.1.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.1.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.1", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.5.1", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.2", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.11", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.11", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.22.4", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.8", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-compat": [{"name": "libxcrypt-compat", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "21.3.1", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.9.19", "release": "8.el9", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.18", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.15.0", "release": "9.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.1", "release": "7.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "234", "release": "19.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.2", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "590", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-rpm-macros": [{"name": "systemd-rpm-macros", "version": "252", "release": "47.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.19.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.13", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.7", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "49", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.0.9", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "libcbor": [{"name": "libcbor", "version": "0.7.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdaemon": [{"name": "libdaemon", "version": "0.14", "release": "23.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "teamd": [{"name": "teamd", "version": "1.31", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.4.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.4", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.5.1", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "252", "release": "47.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "252", "release": "47.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "28", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.5.7", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.5.7", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": "1.11", "release": "26.20190603git.el9", 
"epoch": null, "arch": "noarch", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el9", "epoch": 9, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el9", "epoch": 9, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.8.7", "release": "32.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.11.8", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20210202", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.4.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.4.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"logrotate": [{"name": "logrotate", "version": "3.18.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oddjob": [{"name": "oddjob", "version": "0.34.7", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oddjob-mkhomedir": [{"name": "oddjob-mkhomedir", "version": "0.34.7", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "3.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "3.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.43.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"libpipeline": [{"name": "libpipeline", "version": "1.5.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.27", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.6", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.4", "release": "13.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "7.76.1", "release": "31.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.35.2", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.35.2", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "3.2.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "252", "release": "47.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"kernel-modules-core": [{"name": "kernel-modules-core", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.13.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.77", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "7.76.1", "release": "31.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "63.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.24", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "38.1.45", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "38.1.45", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.29", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.9.1", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.68.4", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.2.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.13.0", "release": "2.el9", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.68.0", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.40.1", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.40.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libuser": [{"name": "libuser", "version": "0.63", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.3.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpgme": [{"name": "gpgme", "version": "1.15.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.14.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gpg": [{"name": "python3-gpg", "version": "1.15.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.16.1.3", "release": 
"34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.6", "release": "1.el9.6", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.6", "release": "17.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.7.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.47", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.2", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.21", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.12.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "0.99.9", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsyslog-logrotate": [{"name": "rsyslog-logrotate", "version": "8.2310.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2310.0", "release": "4.el9", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "10.el9", "epoch": 17, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.4.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.47", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.27", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "passwd": [{"name": "passwd", "version": "0.80", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "NetworkManager-team": [{"name": "NetworkManager-team", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": 
"10.11.8", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240828", "release": "2.git626aa59.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.5p2", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.9.3", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": 
"iproute-tc", "version": "6.2.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-compat": [{"name": "authselect-compat", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el9", "epoch": 4, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.1.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.19.2", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwl100-firmware": [{"name": "iwl100-firmware", "version": "39.31.5.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl1000-firmware": [{"name": "iwl1000-firmware", "version": "39.31.5.1", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "iwl105-firmware": [{"name": "iwl105-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"iwl135-firmware": [{"name": "iwl135-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl2000-firmware": [{"name": "iwl2000-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl2030-firmware": [{"name": "iwl2030-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl3160-firmware": [{"name": "iwl3160-firmware", "version": "25.30.13.0", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "iwl5000-firmware": [{"name": "iwl5000-firmware", "version": "8.83.5.1_1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl5150-firmware": [{"name": "iwl5150-firmware", "version": "8.24.2.2", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl6000g2a-firmware": [{"name": "iwl6000g2a-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl6050-firmware": [{"name": "iwl6050-firmware", "version": "41.28.5.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl7260-firmware": [{"name": "iwl7260-firmware", "version": "25.30.13.0", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "31.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "gpg-pubkey": [{"name": "gpg-pubkey", "version": "3228467c", "release": "613798eb", "epoch": null, "arch": null, "source": "rpm"}, {"name": "gpg-pubkey", "version": "8483c65d", "release": "5ccc5b19", "epoch": null, "arch": null, "source": "rpm"}], "epel-release": [{"name": "epel-release", "version": "9", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "nspr": [{"name": "nspr", "version": "4.35.0", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-system": [{"name": 
"boost-system", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-util": [{"name": "nss-util", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.3", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "m4": [{"name": "m4", "version": "1.4.19", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmpc": [{"name": "libmpc", "version": "1.2.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unzip": [{"name": "unzip", "version": "6.0", "release": "57.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "avahi-libs": [{"name": "avahi-libs", "version": "0.8", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zip": [{"name": "zip", "version": "3.0", "release": "35.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpp": [{"name": "cpp", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bison": [{"name": "bison", "version": "3.7.4", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "flex": [{"name": "flex", "version": "2.6.4", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn-freebl": [{"name": "nss-softokn-freebl", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn": [{"name": "nss-softokn", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss": [{"name": "nss", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-sysinit": [{"name": "nss-sysinit", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-filesystem": [{"name": "boost-filesystem", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-thread": 
[{"name": "boost-thread", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.19", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.58", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.80", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.03", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.174", "release": "462.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.13", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.09", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20200520", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.60.800", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.41", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.300", "release": "7.el9", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2013.0523", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.073", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.66", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "1.94", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.21", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-subs": [{"name": "perl-subs", "version": "1.03", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.17", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.42", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.076", "release": "462.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.031", "release": "4.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", 
"version": "1.02", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.08", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.09", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "4.14", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.13", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.30", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.23", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.43", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.01", "release": "4.el9", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.30", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.85", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.12", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.56", "release": "462.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.21", "release": "460.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.31", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.238", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.52", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.50", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.74", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.15", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.78", "release": "461.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.08", "release": "462.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.32.1", "release": "481.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.32.1", "release": "481.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "aspell": [{"name": "aspell", "version": "0.60.8", "release": "8.el9", "epoch": 12, "arch": "x86_64", 
"source": "rpm"}], "tbb": [{"name": "tbb", "version": "2020.3", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dyninst": [{"name": "dyninst", "version": "12.1.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-runtime": [{"name": "systemtap-runtime", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-headers": [{"name": "kernel-headers", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-headers": [{"name": "glibc-headers", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "strace": [{"name": "strace", "version": "5.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-m4": [{"name": "pkgconf-m4", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libpkgconf": [{"name": "libpkgconf", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf": [{"name": "pkgconf", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-pkg-config": [{"name": "pkgconf-pkg-config", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd-devel": [{"name": "libzstd-devel", "version": "1.5.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-devel": [{"name": "zlib-devel", "version": "1.2.11", "release": "41.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf-devel": [{"name": "elfutils-libelf-devel", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-devel": [{"name": "glibc-devel", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-devel": [{"name": "libxcrypt-devel", "version": "4.4.18", "release": "3.el9", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "gcc": [{"name": "gcc", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-devel": [{"name": "openssl-devel", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-devel": [{"name": "kernel-devel", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-devel": [{"name": "xz-devel", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-devel": [{"name": "elfutils-devel", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-devel": [{"name": "systemtap-devel", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "efivar-libs": [{"name": "efivar-libs", "version": "38", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mokutil": [{"name": "mokutil", "version": "0.6.0", "release": "4.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "systemtap-client": [{"name": "systemtap-client", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap": [{"name": "systemtap", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "qa-tools": [{"name": "qa-tools", "version": "4.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.3", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.5.4", "release": "27.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": 
[{"name": "rpcbind", "version": "1.2.6", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.21.1", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.37", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "7.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.47", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.34", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "4.6.5", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpm-libs": [{"name": "gpm-libs", "version": "1.20.7", "release": "29.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "emacs-filesystem": [{"name": "emacs-filesystem", "version": "27.2", "release": "10.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "8.2.2637", 
"release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "18.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.34", "release": "7.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "4.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "5.4.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.8.4", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.5.4", "release": "27.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": 
"sssd-nfs-idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.2.3", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "53.0.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.5.0", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "2.10", "release": "7.el9.1", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pytz": [{"name": "python3-pytz", "version": "2021.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-babel": [{"name": "python3-babel", "version": "2.9.1", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.4", "release": "12.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyrsistent": [{"name": "python3-pyrsistent", "version": "0.17.3", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-prettytable": [{"name": "python3-prettytable", "version": "0.7.2", "release": "27.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.1.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-netifaces": [{"name": "python3-netifaces", "version": "0.10.6", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "1.1.1", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": 
"2.11.3", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.0", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.21", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.6", "release": "25.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "20.3.0", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "3.2.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libmaxminddb": [{"name": "libmaxminddb", "version": "1.5.2", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "geolite2-country": [{"name": "geolite2-country", "version": "20191217", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "geolite2-city": [{"name": "geolite2-city", "version": "20191217", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "ipcalc": [{"name": "ipcalc", "version": "1.0.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdisk": [{"name": "gdisk", "version": "1.0.7", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": 
"noarch", "source": "rpm"}], "python3-pysocks": [{"name": "python3-pysocks", "version": "1.7.1", "release": "12.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.5", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-chardet": [{"name": "python3-chardet", "version": "4.0.0", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.25.1", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dhcp-common": [{"name": "dhcp-common", "version": "4.4.2", "release": "19.b1.el9", "epoch": 12, "arch": "noarch", "source": "rpm"}], "dhcp-client": [{"name": "dhcp-client", "version": "4.4.2", "release": "19.b1.el9", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "23.4", "release": "19.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.16", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip": [{"name": "python3-pip", "version": "21.3.1", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "5.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.85", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} , 
stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.148 closed. 
41684 1727204471.83242: done with _execute_module (package_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'package_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204471.1155484-43826-95945843935074/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 41684 1727204471.83394: _low_level_execute_command(): starting 41684 1727204471.83439: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204471.1155484-43826-95945843935074/ > /dev/null 2>&1 && sleep 0' 41684 1727204471.84195: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 41684 1727204471.84209: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41684 1727204471.84221: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41684 1727204471.84242: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41684 1727204471.84292: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 41684 1727204471.84307: stderr chunk (state=3): >>>debug2: match not found <<< 41684 1727204471.84321: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204471.84338: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 41684 1727204471.84357: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address 
<<< 41684 1727204471.84374: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 41684 1727204471.84388: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41684 1727204471.84402: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41684 1727204471.84420: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41684 1727204471.84431: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 41684 1727204471.84443: stderr chunk (state=3): >>>debug2: match found <<< 41684 1727204471.84455: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204471.84544: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 41684 1727204471.84568: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41684 1727204471.84590: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41684 1727204471.84693: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41684 1727204471.86624: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41684 1727204471.86628: stdout chunk (state=3): >>><<< 41684 1727204471.86631: stderr chunk (state=3): >>><<< 41684 1727204471.86876: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: 
re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41684 1727204471.86880: handler run complete 41684 1727204471.88415: variable 'ansible_facts' from source: unknown 41684 1727204471.89524: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41684 1727204471.95238: variable 'ansible_facts' from source: unknown 41684 1727204471.95732: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41684 1727204471.96520: attempt loop complete, returning result 41684 1727204471.96531: _execute() done 41684 1727204471.96534: dumping result to json 41684 1727204471.96729: done dumping result, returning 41684 1727204471.96738: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Check which packages are installed [0affcd87-79f5-3839-086d-000000000569] 41684 1727204471.96744: sending task result for task 0affcd87-79f5-3839-086d-000000000569 41684 1727204471.99388: done sending task result for task 0affcd87-79f5-3839-086d-000000000569 41684 1727204471.99392: WORKER PROCESS EXITING ok: [managed-node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 41684 1727204471.99550: no more pending results, returning what we have 41684 1727204471.99553: results queue empty 41684 1727204471.99554: checking for 
any_errors_fatal 41684 1727204471.99565: done checking for any_errors_fatal 41684 1727204471.99566: checking for max_fail_percentage 41684 1727204471.99568: done checking for max_fail_percentage 41684 1727204471.99569: checking to see if all hosts have failed and the running result is not ok 41684 1727204471.99570: done checking to see if all hosts have failed 41684 1727204471.99571: getting the remaining hosts for this loop 41684 1727204471.99572: done getting the remaining hosts for this loop 41684 1727204471.99575: getting the next task for host managed-node1 41684 1727204471.99582: done getting next task for host managed-node1 41684 1727204471.99587: ^ task is: TASK: fedora.linux_system_roles.network : Print network provider 41684 1727204471.99590: ^ state is: HOST STATE: block=3, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 41684 1727204471.99601: getting variables 41684 1727204471.99602: in VariableManager get_vars() 41684 1727204471.99645: Calling all_inventory to load vars for managed-node1 41684 1727204471.99650: Calling groups_inventory to load vars for managed-node1 41684 1727204471.99653: Calling all_plugins_inventory to load vars for managed-node1 41684 1727204471.99666: Calling all_plugins_play to load vars for managed-node1 41684 1727204471.99669: Calling groups_plugins_inventory to load vars for managed-node1 41684 1727204471.99673: Calling groups_plugins_play to load vars for managed-node1 41684 1727204472.01067: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41684 1727204472.03014: done with get_vars() 41684 1727204472.03053: done getting variables 41684 1727204472.03121: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Print network provider] ************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7 Tuesday 24 September 2024 15:01:12 -0400 (0:00:00.974) 0:00:28.433 ***** 41684 1727204472.03171: entering _queue_task() for managed-node1/debug 41684 1727204472.03595: worker is 1 (out of 1 available) 41684 1727204472.03609: exiting _queue_task() for managed-node1/debug 41684 1727204472.03629: done queuing things up, now waiting for results queue to drain 41684 1727204472.03631: waiting for pending results... 
41684 1727204472.03989: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Print network provider 41684 1727204472.04178: in run() - task 0affcd87-79f5-3839-086d-00000000006d 41684 1727204472.04209: variable 'ansible_search_path' from source: unknown 41684 1727204472.04218: variable 'ansible_search_path' from source: unknown 41684 1727204472.04290: calling self._execute() 41684 1727204472.04400: variable 'ansible_host' from source: host vars for 'managed-node1' 41684 1727204472.04414: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41684 1727204472.04428: variable 'omit' from source: magic vars 41684 1727204472.04871: variable 'ansible_distribution_major_version' from source: facts 41684 1727204472.04905: Evaluated conditional (ansible_distribution_major_version != '6'): True 41684 1727204472.04917: variable 'omit' from source: magic vars 41684 1727204472.04983: variable 'omit' from source: magic vars 41684 1727204472.05102: variable 'network_provider' from source: set_fact 41684 1727204472.05130: variable 'omit' from source: magic vars 41684 1727204472.05188: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 41684 1727204472.05235: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 41684 1727204472.05267: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 41684 1727204472.05293: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41684 1727204472.05309: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41684 1727204472.05350: variable 'inventory_hostname' from source: host vars for 'managed-node1' 41684 1727204472.05358: variable 'ansible_host' from source: host vars for 'managed-node1' 41684 
1727204472.05371: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41684 1727204472.05488: Set connection var ansible_connection to ssh 41684 1727204472.05504: Set connection var ansible_pipelining to False 41684 1727204472.05515: Set connection var ansible_module_compression to ZIP_DEFLATED 41684 1727204472.05525: Set connection var ansible_timeout to 10 41684 1727204472.05536: Set connection var ansible_shell_executable to /bin/sh 41684 1727204472.05550: Set connection var ansible_shell_type to sh 41684 1727204472.05587: variable 'ansible_shell_executable' from source: unknown 41684 1727204472.05595: variable 'ansible_connection' from source: unknown 41684 1727204472.05606: variable 'ansible_module_compression' from source: unknown 41684 1727204472.05613: variable 'ansible_shell_type' from source: unknown 41684 1727204472.05619: variable 'ansible_shell_executable' from source: unknown 41684 1727204472.05625: variable 'ansible_host' from source: host vars for 'managed-node1' 41684 1727204472.05632: variable 'ansible_pipelining' from source: unknown 41684 1727204472.05639: variable 'ansible_timeout' from source: unknown 41684 1727204472.05649: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41684 1727204472.05836: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 41684 1727204472.05860: variable 'omit' from source: magic vars 41684 1727204472.05888: starting attempt loop 41684 1727204472.05896: running the handler 41684 1727204472.05948: handler run complete 41684 1727204472.05972: attempt loop complete, returning result 41684 1727204472.05984: _execute() done 41684 1727204472.05997: dumping result to json 41684 1727204472.06005: done dumping result, returning 
41684 1727204472.06016: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Print network provider [0affcd87-79f5-3839-086d-00000000006d] 41684 1727204472.06027: sending task result for task 0affcd87-79f5-3839-086d-00000000006d ok: [managed-node1] => {} MSG: Using network provider: nm 41684 1727204472.06227: no more pending results, returning what we have 41684 1727204472.06232: results queue empty 41684 1727204472.06233: checking for any_errors_fatal 41684 1727204472.06245: done checking for any_errors_fatal 41684 1727204472.06246: checking for max_fail_percentage 41684 1727204472.06248: done checking for max_fail_percentage 41684 1727204472.06248: checking to see if all hosts have failed and the running result is not ok 41684 1727204472.06249: done checking to see if all hosts have failed 41684 1727204472.06250: getting the remaining hosts for this loop 41684 1727204472.06252: done getting the remaining hosts for this loop 41684 1727204472.06256: getting the next task for host managed-node1 41684 1727204472.06268: done getting next task for host managed-node1 41684 1727204472.06274: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 41684 1727204472.06279: ^ state is: HOST STATE: block=3, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 41684 1727204472.06292: getting variables 41684 1727204472.06294: in VariableManager get_vars() 41684 1727204472.06338: Calling all_inventory to load vars for managed-node1 41684 1727204472.06341: Calling groups_inventory to load vars for managed-node1 41684 1727204472.06344: Calling all_plugins_inventory to load vars for managed-node1 41684 1727204472.06355: Calling all_plugins_play to load vars for managed-node1 41684 1727204472.06358: Calling groups_plugins_inventory to load vars for managed-node1 41684 1727204472.06361: Calling groups_plugins_play to load vars for managed-node1 41684 1727204472.08007: done sending task result for task 0affcd87-79f5-3839-086d-00000000006d 41684 1727204472.08011: WORKER PROCESS EXITING 41684 1727204472.10077: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41684 1727204472.13769: done with get_vars() 41684 1727204472.13811: done getting variables 41684 1727204472.13994: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11 Tuesday 24 September 2024 15:01:12 -0400 (0:00:00.108) 0:00:28.541 ***** 41684 1727204472.14032: entering _queue_task() for managed-node1/fail 41684 1727204472.14834: worker is 1 (out of 1 available) 41684 1727204472.14846: exiting _queue_task() for managed-node1/fail 41684 1727204472.14859: done queuing things up, now waiting for results queue to drain 41684 1727204472.14860: waiting for pending results... 
41684 1727204472.15665: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 41684 1727204472.16000: in run() - task 0affcd87-79f5-3839-086d-00000000006e 41684 1727204472.16077: variable 'ansible_search_path' from source: unknown 41684 1727204472.16081: variable 'ansible_search_path' from source: unknown 41684 1727204472.16116: calling self._execute() 41684 1727204472.16326: variable 'ansible_host' from source: host vars for 'managed-node1' 41684 1727204472.16332: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41684 1727204472.16460: variable 'omit' from source: magic vars 41684 1727204472.17134: variable 'ansible_distribution_major_version' from source: facts 41684 1727204472.17147: Evaluated conditional (ansible_distribution_major_version != '6'): True 41684 1727204472.17383: variable 'network_state' from source: role '' defaults 41684 1727204472.17394: Evaluated conditional (network_state != {}): False 41684 1727204472.17398: when evaluation is False, skipping this task 41684 1727204472.17401: _execute() done 41684 1727204472.17403: dumping result to json 41684 1727204472.17406: done dumping result, returning 41684 1727204472.17409: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider [0affcd87-79f5-3839-086d-00000000006e] 41684 1727204472.17416: sending task result for task 0affcd87-79f5-3839-086d-00000000006e 41684 1727204472.17638: done sending task result for task 0affcd87-79f5-3839-086d-00000000006e 41684 1727204472.17643: WORKER PROCESS EXITING skipping: [managed-node1] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 41684 1727204472.17696: no more pending results, 
returning what we have 41684 1727204472.17699: results queue empty 41684 1727204472.17700: checking for any_errors_fatal 41684 1727204472.17708: done checking for any_errors_fatal 41684 1727204472.17709: checking for max_fail_percentage 41684 1727204472.17710: done checking for max_fail_percentage 41684 1727204472.17711: checking to see if all hosts have failed and the running result is not ok 41684 1727204472.17712: done checking to see if all hosts have failed 41684 1727204472.17712: getting the remaining hosts for this loop 41684 1727204472.17714: done getting the remaining hosts for this loop 41684 1727204472.17718: getting the next task for host managed-node1 41684 1727204472.17725: done getting next task for host managed-node1 41684 1727204472.17729: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 41684 1727204472.17732: ^ state is: HOST STATE: block=3, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 41684 1727204472.17753: getting variables 41684 1727204472.17755: in VariableManager get_vars() 41684 1727204472.17802: Calling all_inventory to load vars for managed-node1 41684 1727204472.17805: Calling groups_inventory to load vars for managed-node1 41684 1727204472.17807: Calling all_plugins_inventory to load vars for managed-node1 41684 1727204472.17819: Calling all_plugins_play to load vars for managed-node1 41684 1727204472.17822: Calling groups_plugins_inventory to load vars for managed-node1 41684 1727204472.17825: Calling groups_plugins_play to load vars for managed-node1 41684 1727204472.20788: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41684 1727204472.24420: done with get_vars() 41684 1727204472.24459: done getting variables 41684 1727204472.24644: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18 Tuesday 24 September 2024 15:01:12 -0400 (0:00:00.106) 0:00:28.648 ***** 41684 1727204472.24805: entering _queue_task() for managed-node1/fail 41684 1727204472.25505: worker is 1 (out of 1 available) 41684 1727204472.25518: exiting _queue_task() for managed-node1/fail 41684 1727204472.25531: done queuing things up, now waiting for results queue to drain 41684 1727204472.25532: waiting for pending results... 
41684 1727204472.26459: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 41684 1727204472.26607: in run() - task 0affcd87-79f5-3839-086d-00000000006f 41684 1727204472.26622: variable 'ansible_search_path' from source: unknown 41684 1727204472.26626: variable 'ansible_search_path' from source: unknown 41684 1727204472.26672: calling self._execute() 41684 1727204472.26768: variable 'ansible_host' from source: host vars for 'managed-node1' 41684 1727204472.26774: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41684 1727204472.26782: variable 'omit' from source: magic vars 41684 1727204472.27147: variable 'ansible_distribution_major_version' from source: facts 41684 1727204472.27158: Evaluated conditional (ansible_distribution_major_version != '6'): True 41684 1727204472.27280: variable 'network_state' from source: role '' defaults 41684 1727204472.27291: Evaluated conditional (network_state != {}): False 41684 1727204472.27295: when evaluation is False, skipping this task 41684 1727204472.27298: _execute() done 41684 1727204472.27300: dumping result to json 41684 1727204472.27302: done dumping result, returning 41684 1727204472.27316: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 [0affcd87-79f5-3839-086d-00000000006f] 41684 1727204472.27320: sending task result for task 0affcd87-79f5-3839-086d-00000000006f 41684 1727204472.27419: done sending task result for task 0affcd87-79f5-3839-086d-00000000006f 41684 1727204472.27423: WORKER PROCESS EXITING skipping: [managed-node1] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 41684 1727204472.27481: no more pending results, returning what we have 41684 
1727204472.27485: results queue empty 41684 1727204472.27485: checking for any_errors_fatal 41684 1727204472.27493: done checking for any_errors_fatal 41684 1727204472.27494: checking for max_fail_percentage 41684 1727204472.27496: done checking for max_fail_percentage 41684 1727204472.27496: checking to see if all hosts have failed and the running result is not ok 41684 1727204472.27497: done checking to see if all hosts have failed 41684 1727204472.27498: getting the remaining hosts for this loop 41684 1727204472.27500: done getting the remaining hosts for this loop 41684 1727204472.27504: getting the next task for host managed-node1 41684 1727204472.27511: done getting next task for host managed-node1 41684 1727204472.27515: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 41684 1727204472.27518: ^ state is: HOST STATE: block=3, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 41684 1727204472.27540: getting variables 41684 1727204472.27544: in VariableManager get_vars() 41684 1727204472.27585: Calling all_inventory to load vars for managed-node1 41684 1727204472.27588: Calling groups_inventory to load vars for managed-node1 41684 1727204472.27590: Calling all_plugins_inventory to load vars for managed-node1 41684 1727204472.27600: Calling all_plugins_play to load vars for managed-node1 41684 1727204472.27602: Calling groups_plugins_inventory to load vars for managed-node1 41684 1727204472.27605: Calling groups_plugins_play to load vars for managed-node1 41684 1727204472.29068: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41684 1727204472.30837: done with get_vars() 41684 1727204472.30873: done getting variables 41684 1727204472.30945: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25 Tuesday 24 September 2024 15:01:12 -0400 (0:00:00.061) 0:00:28.711 ***** 41684 1727204472.30984: entering _queue_task() for managed-node1/fail 41684 1727204472.31538: worker is 1 (out of 1 available) 41684 1727204472.31550: exiting _queue_task() for managed-node1/fail 41684 1727204472.31565: done queuing things up, now waiting for results queue to drain 41684 1727204472.31687: waiting for pending results... 
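The task skipped just above (task path tasks/main.yml:18) is a `fail` guard: the log records the action module as `fail` and the false condition as `network_state != {}`. A minimal sketch of such a guard, with hypothetical message wording (this is not the role's verbatim source):

```yaml
- name: Abort applying the network state configuration if the system version of the managed host is below 8
  fail:
    msg: Applying network_state is not supported on this system version  # hypothetical wording
  when:
    - ansible_distribution_major_version != '6'  # evaluated True in the log
    - network_state != {}                        # evaluated False, so the task is skipped
```

Because `network_state` is loaded from "role '' defaults" in the log and evaluates equal to `{}`, the guard only fires when a caller actually supplies a state configuration.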
41684 1727204472.32584: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 41684 1727204472.32749: in run() - task 0affcd87-79f5-3839-086d-000000000070 41684 1727204472.32773: variable 'ansible_search_path' from source: unknown 41684 1727204472.32784: variable 'ansible_search_path' from source: unknown 41684 1727204472.32847: calling self._execute() 41684 1727204472.32976: variable 'ansible_host' from source: host vars for 'managed-node1' 41684 1727204472.32990: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41684 1727204472.33008: variable 'omit' from source: magic vars 41684 1727204472.33439: variable 'ansible_distribution_major_version' from source: facts 41684 1727204472.33462: Evaluated conditional (ansible_distribution_major_version != '6'): True 41684 1727204472.33696: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 41684 1727204472.37440: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 41684 1727204472.37524: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 41684 1727204472.37577: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 41684 1727204472.37625: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 41684 1727204472.37660: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 41684 1727204472.37768: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 41684 1727204472.37816: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 41684 1727204472.37848: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 41684 1727204472.37914: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 41684 1727204472.37934: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 41684 1727204472.38045: variable 'ansible_distribution_major_version' from source: facts 41684 1727204472.38070: Evaluated conditional (ansible_distribution_major_version | int > 9): False 41684 1727204472.38078: when evaluation is False, skipping this task 41684 1727204472.38087: _execute() done 41684 1727204472.38099: dumping result to json 41684 1727204472.38113: done dumping result, returning 41684 1727204472.38126: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later [0affcd87-79f5-3839-086d-000000000070] 41684 1727204472.38138: sending task result for task 0affcd87-79f5-3839-086d-000000000070 skipping: [managed-node1] => { "changed": false, "false_condition": "ansible_distribution_major_version | int > 9", "skip_reason": "Conditional result was False" } 41684 1727204472.38305: no more pending results, returning what we have 41684 1727204472.38309: results queue empty 41684 1727204472.38310: checking for any_errors_fatal 41684 1727204472.38317: done checking for any_errors_fatal 41684 
1727204472.38318: checking for max_fail_percentage 41684 1727204472.38320: done checking for max_fail_percentage 41684 1727204472.38321: checking to see if all hosts have failed and the running result is not ok 41684 1727204472.38322: done checking to see if all hosts have failed 41684 1727204472.38323: getting the remaining hosts for this loop 41684 1727204472.38325: done getting the remaining hosts for this loop 41684 1727204472.38329: getting the next task for host managed-node1 41684 1727204472.38337: done getting next task for host managed-node1 41684 1727204472.38341: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 41684 1727204472.38345: ^ state is: HOST STATE: block=3, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 41684 1727204472.38367: getting variables 41684 1727204472.38369: in VariableManager get_vars() 41684 1727204472.38413: Calling all_inventory to load vars for managed-node1 41684 1727204472.38417: Calling groups_inventory to load vars for managed-node1 41684 1727204472.38420: Calling all_plugins_inventory to load vars for managed-node1 41684 1727204472.38431: Calling all_plugins_play to load vars for managed-node1 41684 1727204472.38434: Calling groups_plugins_inventory to load vars for managed-node1 41684 1727204472.38437: Calling groups_plugins_play to load vars for managed-node1 41684 1727204472.39524: done sending task result for task 0affcd87-79f5-3839-086d-000000000070 41684 1727204472.39527: WORKER PROCESS EXITING 41684 1727204472.41787: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41684 1727204472.44033: done with get_vars() 41684 1727204472.44105: done getting variables 41684 1727204472.44225: Loading ActionModule 'dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36 Tuesday 24 September 2024 15:01:12 -0400 (0:00:00.133) 0:00:28.844 ***** 41684 1727204472.44295: entering _queue_task() for managed-node1/dnf 41684 1727204472.44825: worker is 1 (out of 1 available) 41684 1727204472.44844: exiting _queue_task() for managed-node1/dnf 41684 1727204472.44856: done queuing things up, now waiting for results queue to drain 41684 1727204472.44858: waiting for pending results... 
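The second guard (tasks/main.yml:25) follows the same pattern, this time keyed on the distribution major version: the log shows `ansible_distribution_major_version | int > 9` evaluating to False, so the host is below EL10 and the task is skipped. A hedged sketch, again with hypothetical message text:

```yaml
- name: Abort applying teaming configuration if the system version of the managed host is EL10 or later
  fail:
    msg: Team interfaces are not supported on EL10 or later systems  # hypothetical wording
  when: ansible_distribution_major_version | int > 9
```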
41684 1727204472.45328: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 41684 1727204472.45561: in run() - task 0affcd87-79f5-3839-086d-000000000071 41684 1727204472.45572: variable 'ansible_search_path' from source: unknown 41684 1727204472.45638: variable 'ansible_search_path' from source: unknown 41684 1727204472.45659: calling self._execute() 41684 1727204472.45851: variable 'ansible_host' from source: host vars for 'managed-node1' 41684 1727204472.45869: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41684 1727204472.45888: variable 'omit' from source: magic vars 41684 1727204472.46590: variable 'ansible_distribution_major_version' from source: facts 41684 1727204472.46604: Evaluated conditional (ansible_distribution_major_version != '6'): True 41684 1727204472.47045: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 41684 1727204472.49806: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 41684 1727204472.49883: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 41684 1727204472.49919: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 41684 1727204472.49961: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 41684 1727204472.49992: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 41684 1727204472.50080: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 41684 1727204472.50113: Loading 
FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 41684 1727204472.50140: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 41684 1727204472.50194: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 41684 1727204472.50208: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 41684 1727204472.50337: variable 'ansible_distribution' from source: facts 41684 1727204472.50341: variable 'ansible_distribution_major_version' from source: facts 41684 1727204472.50358: Evaluated conditional (ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7): True 41684 1727204472.50513: variable '__network_wireless_connections_defined' from source: role '' defaults 41684 1727204472.50659: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 41684 1727204472.50690: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 41684 1727204472.50725: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 41684 1727204472.50770: Loading 
FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 41684 1727204472.50787: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 41684 1727204472.50836: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 41684 1727204472.50858: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 41684 1727204472.50887: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 41684 1727204472.50935: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 41684 1727204472.50950: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 41684 1727204472.50995: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 41684 1727204472.51019: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 41684 
1727204472.51052: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 41684 1727204472.51097: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 41684 1727204472.51112: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 41684 1727204472.51295: variable 'network_connections' from source: task vars 41684 1727204472.51307: variable 'interface1' from source: play vars 41684 1727204472.51389: variable 'interface1' from source: play vars 41684 1727204472.51478: variable 'interface1_mac' from source: set_fact 41684 1727204472.51559: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 41684 1727204472.51761: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 41684 1727204472.51807: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 41684 1727204472.51839: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 41684 1727204472.51870: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 41684 1727204472.51919: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 41684 1727204472.51999: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, 
class_only=False) 41684 1727204472.52076: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 41684 1727204472.52115: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 41684 1727204472.52231: variable '__network_team_connections_defined' from source: role '' defaults 41684 1727204472.52690: variable 'network_connections' from source: task vars 41684 1727204472.52700: variable 'interface1' from source: play vars 41684 1727204472.52782: variable 'interface1' from source: play vars 41684 1727204472.52928: variable 'interface1_mac' from source: set_fact 41684 1727204472.53003: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 41684 1727204472.53006: when evaluation is False, skipping this task 41684 1727204472.53009: _execute() done 41684 1727204472.53019: dumping result to json 41684 1727204472.53021: done dumping result, returning 41684 1727204472.53029: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces [0affcd87-79f5-3839-086d-000000000071] 41684 1727204472.53035: sending task result for task 0affcd87-79f5-3839-086d-000000000071 skipping: [managed-node1] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 41684 1727204472.53280: no more pending results, returning what we have 41684 1727204472.53286: results queue empty 41684 1727204472.53287: checking for any_errors_fatal 41684 1727204472.53295: done checking for any_errors_fatal 41684 1727204472.53296: 
checking for max_fail_percentage 41684 1727204472.53298: done checking for max_fail_percentage 41684 1727204472.53298: checking to see if all hosts have failed and the running result is not ok 41684 1727204472.53299: done checking to see if all hosts have failed 41684 1727204472.53300: getting the remaining hosts for this loop 41684 1727204472.53302: done getting the remaining hosts for this loop 41684 1727204472.53313: getting the next task for host managed-node1 41684 1727204472.53329: done getting next task for host managed-node1 41684 1727204472.53334: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 41684 1727204472.53337: ^ state is: HOST STATE: block=3, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 41684 1727204472.53352: done sending task result for task 0affcd87-79f5-3839-086d-000000000071 41684 1727204472.53384: getting variables 41684 1727204472.53386: in VariableManager get_vars() 41684 1727204472.53447: Calling all_inventory to load vars for managed-node1 41684 1727204472.53450: Calling groups_inventory to load vars for managed-node1 41684 1727204472.53453: Calling all_plugins_inventory to load vars for managed-node1 41684 1727204472.53481: Calling all_plugins_play to load vars for managed-node1 41684 1727204472.53488: Calling groups_plugins_inventory to load vars for managed-node1 41684 1727204472.53494: Calling groups_plugins_play to load vars for managed-node1 41684 1727204472.54067: WORKER PROCESS EXITING 41684 1727204472.56423: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41684 1727204472.58668: done with get_vars() 41684 1727204472.58708: done getting variables redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 41684 1727204472.58787: Loading ActionModule 'ansible_collections.ansible.builtin.plugins.action.dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48 Tuesday 24 September 2024 15:01:12 -0400 (0:00:00.145) 0:00:28.989 ***** 41684 1727204472.58823: entering _queue_task() for managed-node1/yum 41684 1727204472.59670: worker is 1 (out of 1 available) 41684 1727204472.59681: exiting _queue_task() for managed-node1/yum 41684 1727204472.59695: done queuing things up, now 
waiting for results queue to drain 41684 1727204472.59696: waiting for pending results... 41684 1727204472.60149: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 41684 1727204472.60285: in run() - task 0affcd87-79f5-3839-086d-000000000072 41684 1727204472.60298: variable 'ansible_search_path' from source: unknown 41684 1727204472.60303: variable 'ansible_search_path' from source: unknown 41684 1727204472.60337: calling self._execute() 41684 1727204472.60433: variable 'ansible_host' from source: host vars for 'managed-node1' 41684 1727204472.60439: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41684 1727204472.60450: variable 'omit' from source: magic vars 41684 1727204472.60859: variable 'ansible_distribution_major_version' from source: facts 41684 1727204472.60873: Evaluated conditional (ansible_distribution_major_version != '6'): True 41684 1727204472.61060: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 41684 1727204472.75167: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 41684 1727204472.75237: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 41684 1727204472.75334: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 41684 1727204472.75396: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 41684 1727204472.75426: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 41684 1727204472.75569: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 41684 1727204472.75596: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 41684 1727204472.75623: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 41684 1727204472.75701: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 41684 1727204472.75715: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 41684 1727204472.75849: variable 'ansible_distribution_major_version' from source: facts 41684 1727204472.75868: Evaluated conditional (ansible_distribution_major_version | int < 8): False 41684 1727204472.75872: when evaluation is False, skipping this task 41684 1727204472.75877: _execute() done 41684 1727204472.75879: dumping result to json 41684 1727204472.75881: done dumping result, returning 41684 1727204472.75889: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces [0affcd87-79f5-3839-086d-000000000072] 41684 1727204472.75892: sending task result for task 0affcd87-79f5-3839-086d-000000000072 41684 1727204472.76004: done sending task result for task 0affcd87-79f5-3839-086d-000000000072 41684 1727204472.76007: WORKER PROCESS EXITING skipping: [managed-node1] => { "changed": false, "false_condition": 
"ansible_distribution_major_version | int < 8", "skip_reason": "Conditional result was False" } 41684 1727204472.76055: no more pending results, returning what we have 41684 1727204472.76059: results queue empty 41684 1727204472.76060: checking for any_errors_fatal 41684 1727204472.76068: done checking for any_errors_fatal 41684 1727204472.76069: checking for max_fail_percentage 41684 1727204472.76071: done checking for max_fail_percentage 41684 1727204472.76072: checking to see if all hosts have failed and the running result is not ok 41684 1727204472.76072: done checking to see if all hosts have failed 41684 1727204472.76073: getting the remaining hosts for this loop 41684 1727204472.76075: done getting the remaining hosts for this loop 41684 1727204472.76079: getting the next task for host managed-node1 41684 1727204472.76086: done getting next task for host managed-node1 41684 1727204472.76090: ^ task is: TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 41684 1727204472.76093: ^ state is: HOST STATE: block=3, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 41684 1727204472.76110: getting variables 41684 1727204472.76112: in VariableManager get_vars() 41684 1727204472.76155: Calling all_inventory to load vars for managed-node1 41684 1727204472.76158: Calling groups_inventory to load vars for managed-node1 41684 1727204472.76191: Calling all_plugins_inventory to load vars for managed-node1 41684 1727204472.76202: Calling all_plugins_play to load vars for managed-node1 41684 1727204472.76205: Calling groups_plugins_inventory to load vars for managed-node1 41684 1727204472.76208: Calling groups_plugins_play to load vars for managed-node1 41684 1727204472.89643: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41684 1727204472.93532: done with get_vars() 41684 1727204472.93573: done getting variables 41684 1727204472.93628: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60 Tuesday 24 September 2024 15:01:12 -0400 (0:00:00.348) 0:00:29.338 ***** 41684 1727204472.93665: entering _queue_task() for managed-node1/fail 41684 1727204472.94018: worker is 1 (out of 1 available) 41684 1727204472.94030: exiting _queue_task() for managed-node1/fail 41684 1727204472.94044: done queuing things up, now waiting for results queue to drain 41684 1727204472.94046: waiting for pending results... 
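The two package-manager checks above (tasks/main.yml:36 and :48) are mutually exclusive on the same fact: the DNF variant is gated on `ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7`, the YUM variant on `ansible_distribution_major_version | int < 8`, and `ansible.builtin.yum` is redirected to `ansible.builtin.dnf` at runtime, as the log notes. The DNF variant is additionally gated on `__network_wireless_connections_defined or __network_team_connections_defined`. A sketch of the pair; the package-list variable and the `check_mode` usage are assumptions, not the role's verbatim source:

```yaml
- name: Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces
  dnf:
    name: "{{ network_packages }}"  # hypothetical variable name
    state: latest
  check_mode: true                  # assumed: report available updates without installing
  when:
    - ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7
    - __network_wireless_connections_defined or __network_team_connections_defined

- name: Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces
  yum:
    name: "{{ network_packages }}"  # hypothetical variable name
    state: latest
  check_mode: true                  # assumed, as above
  when: ansible_distribution_major_version | int < 8
```

On this host the log shows the DNF distribution condition True but the wireless/team condition False, and the YUM version condition False, so both tasks are skipped.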
41684 1727204472.94458: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 41684 1727204472.94595: in run() - task 0affcd87-79f5-3839-086d-000000000073 41684 1727204472.94615: variable 'ansible_search_path' from source: unknown 41684 1727204472.94619: variable 'ansible_search_path' from source: unknown 41684 1727204472.94656: calling self._execute() 41684 1727204472.94757: variable 'ansible_host' from source: host vars for 'managed-node1' 41684 1727204472.94761: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41684 1727204472.94775: variable 'omit' from source: magic vars 41684 1727204472.95323: variable 'ansible_distribution_major_version' from source: facts 41684 1727204472.95335: Evaluated conditional (ansible_distribution_major_version != '6'): True 41684 1727204472.95468: variable '__network_wireless_connections_defined' from source: role '' defaults 41684 1727204472.95686: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 41684 1727204472.99112: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 41684 1727204472.99202: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 41684 1727204472.99241: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 41684 1727204472.99279: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 41684 1727204472.99306: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 41684 1727204472.99390: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 
(found_in_cache=True, class_only=False) 41684 1727204472.99420: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 41684 1727204472.99444: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 41684 1727204472.99496: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 41684 1727204472.99512: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 41684 1727204472.99555: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 41684 1727204472.99587: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 41684 1727204472.99611: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 41684 1727204472.99651: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 41684 1727204472.99667: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 41684 1727204472.99714: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 41684 1727204472.99738: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 41684 1727204472.99760: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 41684 1727204472.99808: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 41684 1727204472.99821: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 41684 1727204473.00021: variable 'network_connections' from source: task vars 41684 1727204473.00034: variable 'interface1' from source: play vars 41684 1727204473.00118: variable 'interface1' from source: play vars 41684 1727204473.00201: variable 'interface1_mac' from source: set_fact 41684 1727204473.00287: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 41684 1727204473.00483: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 41684 1727204473.00519: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 41684 1727204473.00580: Loading TestModule 'mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 41684 1727204473.00618: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 41684 1727204473.00663: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 41684 1727204473.00689: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 41684 1727204473.00713: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 41684 1727204473.00737: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 41684 1727204473.00803: variable '__network_team_connections_defined' from source: role '' defaults 41684 1727204473.01062: variable 'network_connections' from source: task vars 41684 1727204473.01068: variable 'interface1' from source: play vars 41684 1727204473.01127: variable 'interface1' from source: play vars 41684 1727204473.01206: variable 'interface1_mac' from source: set_fact 41684 1727204473.01242: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 41684 1727204473.01245: when evaluation is False, skipping this task 41684 1727204473.01248: _execute() done 41684 1727204473.01250: dumping result to json 41684 1727204473.01253: done dumping result, returning 41684 1727204473.01261: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 
[0affcd87-79f5-3839-086d-000000000073] 41684 1727204473.01274: sending task result for task 0affcd87-79f5-3839-086d-000000000073 skipping: [managed-node1] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 41684 1727204473.01431: no more pending results, returning what we have 41684 1727204473.01436: results queue empty 41684 1727204473.01437: checking for any_errors_fatal 41684 1727204473.01445: done checking for any_errors_fatal 41684 1727204473.01446: checking for max_fail_percentage 41684 1727204473.01447: done checking for max_fail_percentage 41684 1727204473.01448: checking to see if all hosts have failed and the running result is not ok 41684 1727204473.01449: done checking to see if all hosts have failed 41684 1727204473.01450: getting the remaining hosts for this loop 41684 1727204473.01452: done getting the remaining hosts for this loop 41684 1727204473.01457: getting the next task for host managed-node1 41684 1727204473.01470: done getting next task for host managed-node1 41684 1727204473.01475: ^ task is: TASK: fedora.linux_system_roles.network : Install packages 41684 1727204473.01478: ^ state is: HOST STATE: block=3, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 41684 1727204473.01494: done sending task result for task 0affcd87-79f5-3839-086d-000000000073 41684 1727204473.01498: WORKER PROCESS EXITING 41684 1727204473.01506: getting variables 41684 1727204473.01508: in VariableManager get_vars() 41684 1727204473.01552: Calling all_inventory to load vars for managed-node1 41684 1727204473.01555: Calling groups_inventory to load vars for managed-node1 41684 1727204473.01558: Calling all_plugins_inventory to load vars for managed-node1 41684 1727204473.01573: Calling all_plugins_play to load vars for managed-node1 41684 1727204473.01576: Calling groups_plugins_inventory to load vars for managed-node1 41684 1727204473.01580: Calling groups_plugins_play to load vars for managed-node1 41684 1727204473.02558: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41684 1727204473.03729: done with get_vars() 41684 1727204473.03752: done getting variables 41684 1727204473.03816: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install packages] ******************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73 Tuesday 24 September 2024 15:01:13 -0400 (0:00:00.101) 0:00:29.440 ***** 41684 1727204473.03853: entering _queue_task() for managed-node1/package 41684 1727204473.04200: worker is 1 (out of 1 available) 41684 1727204473.04212: exiting _queue_task() for managed-node1/package 41684 1727204473.04224: done queuing things up, now waiting for results queue to drain 41684 1727204473.04225: waiting for pending results... 
41684 1727204473.04533: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Install packages 41684 1727204473.04659: in run() - task 0affcd87-79f5-3839-086d-000000000074 41684 1727204473.04680: variable 'ansible_search_path' from source: unknown 41684 1727204473.04687: variable 'ansible_search_path' from source: unknown 41684 1727204473.04723: calling self._execute() 41684 1727204473.04821: variable 'ansible_host' from source: host vars for 'managed-node1' 41684 1727204473.04824: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41684 1727204473.04834: variable 'omit' from source: magic vars 41684 1727204473.05195: variable 'ansible_distribution_major_version' from source: facts 41684 1727204473.05206: Evaluated conditional (ansible_distribution_major_version != '6'): True 41684 1727204473.05345: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 41684 1727204473.05543: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 41684 1727204473.05581: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 41684 1727204473.05609: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 41684 1727204473.05673: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 41684 1727204473.05756: variable 'network_packages' from source: role '' defaults 41684 1727204473.05833: variable '__network_provider_setup' from source: role '' defaults 41684 1727204473.05842: variable '__network_service_name_default_nm' from source: role '' defaults 41684 1727204473.05894: variable '__network_service_name_default_nm' from source: role '' defaults 41684 1727204473.05901: variable '__network_packages_default_nm' from source: role '' defaults 41684 1727204473.05951: variable 
'__network_packages_default_nm' from source: role '' defaults 41684 1727204473.06071: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 41684 1727204473.07826: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 41684 1727204473.07890: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 41684 1727204473.07930: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 41684 1727204473.07960: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 41684 1727204473.07991: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 41684 1727204473.08081: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 41684 1727204473.08108: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 41684 1727204473.08133: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 41684 1727204473.08176: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 41684 1727204473.08190: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 41684 
1727204473.08232: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 41684 1727204473.08254: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 41684 1727204473.08284: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 41684 1727204473.08322: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 41684 1727204473.08335: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 41684 1727204473.08570: variable '__network_packages_default_gobject_packages' from source: role '' defaults 41684 1727204473.08685: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 41684 1727204473.08705: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 41684 1727204473.08729: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 41684 1727204473.08771: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 41684 1727204473.08785: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 41684 1727204473.08878: variable 'ansible_python' from source: facts 41684 1727204473.08905: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 41684 1727204473.08990: variable '__network_wpa_supplicant_required' from source: role '' defaults 41684 1727204473.09069: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 41684 1727204473.09193: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 41684 1727204473.09216: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 41684 1727204473.09247: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 41684 1727204473.09281: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 41684 1727204473.09299: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 41684 1727204473.09345: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 41684 1727204473.09369: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 41684 1727204473.09391: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 41684 1727204473.09428: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 41684 1727204473.09442: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 41684 1727204473.09608: variable 'network_connections' from source: task vars 41684 1727204473.09612: variable 'interface1' from source: play vars 41684 1727204473.09688: variable 'interface1' from source: play vars 41684 1727204473.09790: variable 'interface1_mac' from source: set_fact 41684 1727204473.09855: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 41684 1727204473.09879: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 41684 1727204473.09900: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 
(found_in_cache=True, class_only=False) 41684 1727204473.09923: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 41684 1727204473.09960: variable '__network_wireless_connections_defined' from source: role '' defaults 41684 1727204473.10149: variable 'network_connections' from source: task vars 41684 1727204473.10153: variable 'interface1' from source: play vars 41684 1727204473.10230: variable 'interface1' from source: play vars 41684 1727204473.10319: variable 'interface1_mac' from source: set_fact 41684 1727204473.10373: variable '__network_packages_default_wireless' from source: role '' defaults 41684 1727204473.10430: variable '__network_wireless_connections_defined' from source: role '' defaults 41684 1727204473.10638: variable 'network_connections' from source: task vars 41684 1727204473.10642: variable 'interface1' from source: play vars 41684 1727204473.10692: variable 'interface1' from source: play vars 41684 1727204473.10751: variable 'interface1_mac' from source: set_fact 41684 1727204473.10777: variable '__network_packages_default_team' from source: role '' defaults 41684 1727204473.10835: variable '__network_team_connections_defined' from source: role '' defaults 41684 1727204473.11042: variable 'network_connections' from source: task vars 41684 1727204473.11046: variable 'interface1' from source: play vars 41684 1727204473.11095: variable 'interface1' from source: play vars 41684 1727204473.11155: variable 'interface1_mac' from source: set_fact 41684 1727204473.11206: variable '__network_service_name_default_initscripts' from source: role '' defaults 41684 1727204473.11251: variable '__network_service_name_default_initscripts' from source: role '' defaults 41684 1727204473.11255: variable '__network_packages_default_initscripts' from source: role '' defaults 41684 1727204473.11303: variable 
'__network_packages_default_initscripts' from source: role '' defaults 41684 1727204473.12101: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 41684 1727204473.12106: variable 'network_connections' from source: task vars 41684 1727204473.12109: variable 'interface1' from source: play vars 41684 1727204473.12590: variable 'interface1' from source: play vars 41684 1727204473.12594: variable 'interface1_mac' from source: set_fact 41684 1727204473.12596: variable 'ansible_distribution' from source: facts 41684 1727204473.12598: variable '__network_rh_distros' from source: role '' defaults 41684 1727204473.12600: variable 'ansible_distribution_major_version' from source: facts 41684 1727204473.12603: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 41684 1727204473.12605: variable 'ansible_distribution' from source: facts 41684 1727204473.12607: variable '__network_rh_distros' from source: role '' defaults 41684 1727204473.12609: variable 'ansible_distribution_major_version' from source: facts 41684 1727204473.12611: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 41684 1727204473.12613: variable 'ansible_distribution' from source: facts 41684 1727204473.12615: variable '__network_rh_distros' from source: role '' defaults 41684 1727204473.12617: variable 'ansible_distribution_major_version' from source: facts 41684 1727204473.12697: variable 'network_provider' from source: set_fact 41684 1727204473.12701: variable 'ansible_facts' from source: unknown 41684 1727204473.13463: Evaluated conditional (not network_packages is subset(ansible_facts.packages.keys())): False 41684 1727204473.13467: when evaluation is False, skipping this task 41684 1727204473.13471: _execute() done 41684 1727204473.13474: dumping result to json 41684 1727204473.13476: done dumping result, returning 41684 1727204473.13485: done running TaskExecutor() for 
managed-node1/TASK: fedora.linux_system_roles.network : Install packages [0affcd87-79f5-3839-086d-000000000074] 41684 1727204473.13490: sending task result for task 0affcd87-79f5-3839-086d-000000000074 skipping: [managed-node1] => { "changed": false, "false_condition": "not network_packages is subset(ansible_facts.packages.keys())", "skip_reason": "Conditional result was False" } 41684 1727204473.13646: no more pending results, returning what we have 41684 1727204473.13651: results queue empty 41684 1727204473.13652: checking for any_errors_fatal 41684 1727204473.13658: done checking for any_errors_fatal 41684 1727204473.13659: checking for max_fail_percentage 41684 1727204473.13660: done checking for max_fail_percentage 41684 1727204473.13661: checking to see if all hosts have failed and the running result is not ok 41684 1727204473.13662: done checking to see if all hosts have failed 41684 1727204473.13663: getting the remaining hosts for this loop 41684 1727204473.13666: done getting the remaining hosts for this loop 41684 1727204473.13671: getting the next task for host managed-node1 41684 1727204473.13678: done getting next task for host managed-node1 41684 1727204473.13682: ^ task is: TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 41684 1727204473.13686: ^ state is: HOST STATE: block=3, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 41684 1727204473.13705: getting variables 41684 1727204473.13706: in VariableManager get_vars() 41684 1727204473.13749: Calling all_inventory to load vars for managed-node1 41684 1727204473.13752: Calling groups_inventory to load vars for managed-node1 41684 1727204473.13754: Calling all_plugins_inventory to load vars for managed-node1 41684 1727204473.13773: Calling all_plugins_play to load vars for managed-node1 41684 1727204473.13782: Calling groups_plugins_inventory to load vars for managed-node1 41684 1727204473.13786: Calling groups_plugins_play to load vars for managed-node1 41684 1727204473.14393: done sending task result for task 0affcd87-79f5-3839-086d-000000000074 41684 1727204473.14397: WORKER PROCESS EXITING 41684 1727204473.15259: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41684 1727204473.16486: done with get_vars() 41684 1727204473.16511: done getting variables 41684 1727204473.16558: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85 Tuesday 24 September 2024 15:01:13 -0400 (0:00:00.127) 0:00:29.567 ***** 41684 1727204473.16591: entering _queue_task() for managed-node1/package 41684 1727204473.16842: worker is 1 (out of 1 available) 41684 1727204473.16855: exiting _queue_task() for managed-node1/package 41684 1727204473.16872: done queuing things up, now waiting for results queue to drain 41684 1727204473.16874: waiting for pending results... 
41684 1727204473.17073: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 41684 1727204473.17169: in run() - task 0affcd87-79f5-3839-086d-000000000075 41684 1727204473.17180: variable 'ansible_search_path' from source: unknown 41684 1727204473.17184: variable 'ansible_search_path' from source: unknown 41684 1727204473.17214: calling self._execute() 41684 1727204473.17293: variable 'ansible_host' from source: host vars for 'managed-node1' 41684 1727204473.17297: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41684 1727204473.17305: variable 'omit' from source: magic vars 41684 1727204473.17741: variable 'ansible_distribution_major_version' from source: facts 41684 1727204473.17760: Evaluated conditional (ansible_distribution_major_version != '6'): True 41684 1727204473.17918: variable 'network_state' from source: role '' defaults 41684 1727204473.17934: Evaluated conditional (network_state != {}): False 41684 1727204473.17940: when evaluation is False, skipping this task 41684 1727204473.17947: _execute() done 41684 1727204473.17953: dumping result to json 41684 1727204473.17959: done dumping result, returning 41684 1727204473.17975: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable [0affcd87-79f5-3839-086d-000000000075] 41684 1727204473.17987: sending task result for task 0affcd87-79f5-3839-086d-000000000075 skipping: [managed-node1] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 41684 1727204473.18165: no more pending results, returning what we have 41684 1727204473.18170: results queue empty 41684 1727204473.18171: checking for any_errors_fatal 41684 1727204473.18182: done checking for any_errors_fatal 41684 1727204473.18182: checking for max_fail_percentage 41684 
1727204473.18184: done checking for max_fail_percentage 41684 1727204473.18185: checking to see if all hosts have failed and the running result is not ok 41684 1727204473.18186: done checking to see if all hosts have failed 41684 1727204473.18187: getting the remaining hosts for this loop 41684 1727204473.18189: done getting the remaining hosts for this loop 41684 1727204473.18193: getting the next task for host managed-node1 41684 1727204473.18200: done getting next task for host managed-node1 41684 1727204473.18205: ^ task is: TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 41684 1727204473.18208: ^ state is: HOST STATE: block=3, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 41684 1727204473.18238: getting variables 41684 1727204473.18241: in VariableManager get_vars() 41684 1727204473.18291: Calling all_inventory to load vars for managed-node1 41684 1727204473.18294: Calling groups_inventory to load vars for managed-node1 41684 1727204473.18297: Calling all_plugins_inventory to load vars for managed-node1 41684 1727204473.18310: Calling all_plugins_play to load vars for managed-node1 41684 1727204473.18313: Calling groups_plugins_inventory to load vars for managed-node1 41684 1727204473.18317: Calling groups_plugins_play to load vars for managed-node1 41684 1727204473.19146: done sending task result for task 0affcd87-79f5-3839-086d-000000000075 41684 1727204473.19149: WORKER PROCESS EXITING 41684 1727204473.19631: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41684 1727204473.20575: done with get_vars() 41684 1727204473.20595: done getting variables 41684 1727204473.20643: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96 Tuesday 24 September 2024 15:01:13 -0400 (0:00:00.040) 0:00:29.608 ***** 41684 1727204473.20672: entering _queue_task() for managed-node1/package 41684 1727204473.20921: worker is 1 (out of 1 available) 41684 1727204473.20934: exiting _queue_task() for managed-node1/package 41684 1727204473.20947: done queuing things up, now waiting for results queue to drain 41684 1727204473.20948: waiting for pending results... 
41684 1727204473.21145: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable
41684 1727204473.21237: in run() - task 0affcd87-79f5-3839-086d-000000000076
41684 1727204473.21250: variable 'ansible_search_path' from source: unknown
41684 1727204473.21254: variable 'ansible_search_path' from source: unknown
41684 1727204473.21292: calling self._execute()
41684 1727204473.21371: variable 'ansible_host' from source: host vars for 'managed-node1'
41684 1727204473.21377: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1'
41684 1727204473.21387: variable 'omit' from source: magic vars
41684 1727204473.21679: variable 'ansible_distribution_major_version' from source: facts
41684 1727204473.21689: Evaluated conditional (ansible_distribution_major_version != '6'): True
41684 1727204473.21780: variable 'network_state' from source: role '' defaults
41684 1727204473.21789: Evaluated conditional (network_state != {}): False
41684 1727204473.21792: when evaluation is False, skipping this task
41684 1727204473.21795: _execute() done
41684 1727204473.21797: dumping result to json
41684 1727204473.21801: done dumping result, returning
41684 1727204473.21808: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable [0affcd87-79f5-3839-086d-000000000076]
41684 1727204473.21818: sending task result for task 0affcd87-79f5-3839-086d-000000000076
41684 1727204473.21911: done sending task result for task 0affcd87-79f5-3839-086d-000000000076
41684 1727204473.21914: WORKER PROCESS EXITING
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "network_state != {}",
    "skip_reason": "Conditional result was False"
}
41684 1727204473.21979: no more pending results, returning what we have
41684 1727204473.21983: results queue empty
41684 1727204473.21984: checking for any_errors_fatal
41684 1727204473.21991: done checking for any_errors_fatal
41684 1727204473.21992: checking for max_fail_percentage
41684 1727204473.21994: done checking for max_fail_percentage
41684 1727204473.21995: checking to see if all hosts have failed and the running result is not ok
41684 1727204473.21995: done checking to see if all hosts have failed
41684 1727204473.21996: getting the remaining hosts for this loop
41684 1727204473.21998: done getting the remaining hosts for this loop
41684 1727204473.22002: getting the next task for host managed-node1
41684 1727204473.22007: done getting next task for host managed-node1
41684 1727204473.22011: ^ task is: TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces
41684 1727204473.22014: ^ state is: HOST STATE: block=3, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
41684 1727204473.22041: getting variables
41684 1727204473.22043: in VariableManager get_vars()
41684 1727204473.22082: Calling all_inventory to load vars for managed-node1
41684 1727204473.22085: Calling groups_inventory to load vars for managed-node1
41684 1727204473.22087: Calling all_plugins_inventory to load vars for managed-node1
41684 1727204473.22095: Calling all_plugins_play to load vars for managed-node1
41684 1727204473.22098: Calling groups_plugins_inventory to load vars for managed-node1
41684 1727204473.22100: Calling groups_plugins_play to load vars for managed-node1
41684 1727204473.22934: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
41684 1727204473.23878: done with get_vars()
41684 1727204473.23897: done getting variables
41684 1727204473.23942: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces] ***
task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109
Tuesday 24 September 2024 15:01:13 -0400 (0:00:00.032) 0:00:29.641 *****

41684 1727204473.23971: entering _queue_task() for managed-node1/service
41684 1727204473.24219: worker is 1 (out of 1 available)
41684 1727204473.24232: exiting _queue_task() for managed-node1/service
41684 1727204473.24245: done queuing things up, now waiting for results queue to drain
41684 1727204473.24246: waiting for pending results...
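[Editor's note] The "Evaluated conditional (network_state != {}): False" records above show why the python3-libnmstate task was skipped: Ansible renders each `when` expression through Jinja2 (version 3.1.4 in this run) against the task's variables and interprets the result as a boolean. The sketch below is illustrative only, not Ansible's actual implementation; the helper name `evaluate_conditional` is invented for this example.

```python
# Illustrative sketch (NOT Ansible's real code): render a `when`-style
# expression with Jinja2 and interpret the rendered string as a boolean.
from jinja2 import Environment

def evaluate_conditional(expression, variables):
    """Wrap `expression` in {{ }}, render it with `variables`, return a bool."""
    env = Environment()
    rendered = env.from_string("{{ %s }}" % expression).render(**variables)
    return rendered == "True"

# With the role default network_state = {}, the conditional from the log is False,
# so the task is skipped; a non-empty network_state would make it run.
print(evaluate_conditional("network_state != {}", {"network_state": {}}))
print(evaluate_conditional("network_state != {}", {"network_state": {"interfaces": []}}))
```

This also explains the pair of records per task: `ansible_distribution_major_version != '6'` renders to `True` first, then `network_state != {}` renders to `False` and short-circuits execution.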
41684 1727204473.24448: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces
41684 1727204473.24541: in run() - task 0affcd87-79f5-3839-086d-000000000077
41684 1727204473.24553: variable 'ansible_search_path' from source: unknown
41684 1727204473.24557: variable 'ansible_search_path' from source: unknown
41684 1727204473.24588: calling self._execute()
41684 1727204473.24670: variable 'ansible_host' from source: host vars for 'managed-node1'
41684 1727204473.24674: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1'
41684 1727204473.24687: variable 'omit' from source: magic vars
41684 1727204473.24969: variable 'ansible_distribution_major_version' from source: facts
41684 1727204473.24982: Evaluated conditional (ansible_distribution_major_version != '6'): True
41684 1727204473.25064: variable '__network_wireless_connections_defined' from source: role '' defaults
41684 1727204473.25208: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name
41684 1727204473.27126: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py
41684 1727204473.27178: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py
41684 1727204473.27216: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py
41684 1727204473.27245: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py
41684 1727204473.27266: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py
41684 1727204473.27326: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
41684 1727204473.27344: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
41684 1727204473.27367: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
41684 1727204473.27397: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
41684 1727204473.27409: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
41684 1727204473.27441: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
41684 1727204473.27458: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
41684 1727204473.27484: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
41684 1727204473.27510: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
41684 1727204473.27520: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
41684 1727204473.27548: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
41684 1727204473.27563: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
41684 1727204473.27588: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
41684 1727204473.27613: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
41684 1727204473.27625: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
41684 1727204473.27746: variable 'network_connections' from source: task vars
41684 1727204473.27757: variable 'interface1' from source: play vars
41684 1727204473.27821: variable 'interface1' from source: play vars
41684 1727204473.27881: variable 'interface1_mac' from source: set_fact
41684 1727204473.27939: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name
41684 1727204473.28054: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py
41684 1727204473.28084: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py
41684 1727204473.28115: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py
41684 1727204473.28139: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py
41684 1727204473.28175: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False)
41684 1727204473.28190: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False)
41684 1727204473.28207: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False)
41684 1727204473.28226: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False)
41684 1727204473.28278: variable '__network_team_connections_defined' from source: role '' defaults
41684 1727204473.28435: variable 'network_connections' from source: task vars
41684 1727204473.28438: variable 'interface1' from source: play vars
41684 1727204473.28489: variable 'interface1' from source: play vars
41684 1727204473.28540: variable 'interface1_mac' from source: set_fact
41684 1727204473.28577: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False
41684 1727204473.28581: when evaluation is False, skipping this task
41684 1727204473.28583: _execute() done
41684 1727204473.28586: dumping result to json
41684 1727204473.28588: done dumping result, returning
41684 1727204473.28593: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces [0affcd87-79f5-3839-086d-000000000077]
41684 1727204473.28607: sending task result for task 0affcd87-79f5-3839-086d-000000000077
41684 1727204473.28701: done sending task result for task 0affcd87-79f5-3839-086d-000000000077
41684 1727204473.28704: WORKER PROCESS EXITING
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined",
    "skip_reason": "Conditional result was False"
}
41684 1727204473.28758: no more pending results, returning what we have
41684 1727204473.28762: results queue empty
41684 1727204473.28762: checking for any_errors_fatal
41684 1727204473.28778: done checking for any_errors_fatal
41684 1727204473.28779: checking for max_fail_percentage
41684 1727204473.28781: done checking for max_fail_percentage
41684 1727204473.28782: checking to see if all hosts have failed and the running result is not ok
41684 1727204473.28783: done checking to see if all hosts have failed
41684 1727204473.28783: getting the remaining hosts for this loop
41684 1727204473.28785: done getting the remaining hosts for this loop
41684 1727204473.28789: getting the next task for host managed-node1
41684 1727204473.28795: done getting next task for host managed-node1
41684 1727204473.28799: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start NetworkManager
41684 1727204473.28802: ^ state is: HOST STATE: block=3, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
41684 1727204473.28819: getting variables
41684 1727204473.28821: in VariableManager get_vars()
41684 1727204473.28860: Calling all_inventory to load vars for managed-node1
41684 1727204473.28865: Calling groups_inventory to load vars for managed-node1
41684 1727204473.28867: Calling all_plugins_inventory to load vars for managed-node1
41684 1727204473.28881: Calling all_plugins_play to load vars for managed-node1
41684 1727204473.28884: Calling groups_plugins_inventory to load vars for managed-node1
41684 1727204473.28891: Calling groups_plugins_play to load vars for managed-node1
41684 1727204473.29851: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
41684 1727204473.30786: done with get_vars()
41684 1727204473.30803: done getting variables
41684 1727204473.30851: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Enable and start NetworkManager] *****
task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122
Tuesday 24 September 2024 15:01:13 -0400 (0:00:00.069) 0:00:29.710 *****

41684 1727204473.30876: entering _queue_task() for managed-node1/service
41684 1727204473.31111: worker is 1 (out of 1 available)
41684 1727204473.31123: exiting _queue_task() for managed-node1/service
41684 1727204473.31137: done queuing things up, now waiting for results queue to drain
41684 1727204473.31138: waiting for pending results...
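[Editor's note] The many repeated "Loading FilterModule ... (found_in_cache=True, class_only=False)" records above reflect Ansible's plugin loader: each plugin file is imported from disk once, then served from an in-memory cache on every later lookup. The sketch below is an assumed, simplified model of that pattern (the names `load_plugin` and `_plugin_cache` are invented for illustration), not Ansible's actual loader.

```python
# Minimal sketch of a cached plugin loader (assumed model, not Ansible's code).
import importlib.util

_plugin_cache = {}  # plugin name -> loaded module

def load_plugin(name, path):
    """Load a plugin module from `path`, caching it by `name`.

    Returns (module, found_in_cache) so repeat lookups mirror the log's
    found_in_cache=True records.
    """
    if name in _plugin_cache:
        return _plugin_cache[name], True   # served from cache
    spec = importlib.util.spec_from_file_location(name, path)
    module = importlib.util.module_from_spec(spec)
    spec.loader.exec_module(module)        # first (and only) disk import
    _plugin_cache[name] = module
    return module, False
```

The first `Loading FilterModule 'core' from .../filter/core.py` record is the disk import; every later record for the same module carries `found_in_cache=True` because only the cache lookup runs.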
41684 1727204473.31328: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager
41684 1727204473.31417: in run() - task 0affcd87-79f5-3839-086d-000000000078
41684 1727204473.31429: variable 'ansible_search_path' from source: unknown
41684 1727204473.31432: variable 'ansible_search_path' from source: unknown
41684 1727204473.31465: calling self._execute()
41684 1727204473.31541: variable 'ansible_host' from source: host vars for 'managed-node1'
41684 1727204473.31544: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1'
41684 1727204473.31553: variable 'omit' from source: magic vars
41684 1727204473.31843: variable 'ansible_distribution_major_version' from source: facts
41684 1727204473.31853: Evaluated conditional (ansible_distribution_major_version != '6'): True
41684 1727204473.31969: variable 'network_provider' from source: set_fact
41684 1727204473.31973: variable 'network_state' from source: role '' defaults
41684 1727204473.31980: Evaluated conditional (network_provider == "nm" or network_state != {}): True
41684 1727204473.31985: variable 'omit' from source: magic vars
41684 1727204473.32025: variable 'omit' from source: magic vars
41684 1727204473.32049: variable 'network_service_name' from source: role '' defaults
41684 1727204473.32102: variable 'network_service_name' from source: role '' defaults
41684 1727204473.32179: variable '__network_provider_setup' from source: role '' defaults
41684 1727204473.32183: variable '__network_service_name_default_nm' from source: role '' defaults
41684 1727204473.32228: variable '__network_service_name_default_nm' from source: role '' defaults
41684 1727204473.32237: variable '__network_packages_default_nm' from source: role '' defaults
41684 1727204473.32285: variable '__network_packages_default_nm' from source: role '' defaults
41684 1727204473.32433: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name
41684 1727204473.34025: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py
41684 1727204473.34080: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py
41684 1727204473.34112: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py
41684 1727204473.34139: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py
41684 1727204473.34159: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py
41684 1727204473.34222: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
41684 1727204473.34242: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
41684 1727204473.34259: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
41684 1727204473.34289: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
41684 1727204473.34304: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
41684 1727204473.34335: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
41684 1727204473.34351: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
41684 1727204473.34369: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
41684 1727204473.34395: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
41684 1727204473.34407: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
41684 1727204473.34565: variable '__network_packages_default_gobject_packages' from source: role '' defaults
41684 1727204473.34644: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
41684 1727204473.34666: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
41684 1727204473.34681: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
41684 1727204473.34709: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
41684 1727204473.34720: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
41684 1727204473.34788: variable 'ansible_python' from source: facts
41684 1727204473.34806: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults
41684 1727204473.34868: variable '__network_wpa_supplicant_required' from source: role '' defaults
41684 1727204473.34924: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults
41684 1727204473.35013: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
41684 1727204473.35030: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
41684 1727204473.35046: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
41684 1727204473.35079: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
41684 1727204473.35088: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
41684 1727204473.35123: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
41684 1727204473.35142: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
41684 1727204473.35159: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
41684 1727204473.35191: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
41684 1727204473.35200: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
41684 1727204473.35299: variable 'network_connections' from source: task vars
41684 1727204473.35303: variable 'interface1' from source: play vars
41684 1727204473.35357: variable 'interface1' from source: play vars
41684 1727204473.35443: variable 'interface1_mac' from source: set_fact
41684 1727204473.35540: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name
41684 1727204473.35682: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py
41684 1727204473.35717: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py
41684 1727204473.35754: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py
41684 1727204473.35788: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py
41684 1727204473.35833: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False)
41684 1727204473.35857: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False)
41684 1727204473.35886: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False)
41684 1727204473.35910: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False)
41684 1727204473.35945: variable '__network_wireless_connections_defined' from source: role '' defaults
41684 1727204473.36135: variable 'network_connections' from source: task vars
41684 1727204473.36141: variable 'interface1' from source: play vars
41684 1727204473.36201: variable 'interface1' from source: play vars
41684 1727204473.36266: variable 'interface1_mac' from source: set_fact
41684 1727204473.36313: variable '__network_packages_default_wireless' from source: role '' defaults
41684 1727204473.36370: variable '__network_wireless_connections_defined' from source: role '' defaults
41684 1727204473.36567: variable 'network_connections' from source: task vars
41684 1727204473.36573: variable 'interface1' from source: play vars
41684 1727204473.36627: variable 'interface1' from source: play vars
41684 1727204473.36691: variable 'interface1_mac' from source: set_fact
41684 1727204473.36719: variable '__network_packages_default_team' from source: role '' defaults
41684 1727204473.36774: variable '__network_team_connections_defined' from source: role '' defaults
41684 1727204473.36971: variable 'network_connections' from source: task vars
41684 1727204473.36975: variable 'interface1' from source: play vars
41684 1727204473.37029: variable 'interface1' from source: play vars
41684 1727204473.37092: variable 'interface1_mac' from source: set_fact
41684 1727204473.37143: variable '__network_service_name_default_initscripts' from source: role '' defaults
41684 1727204473.37460: variable '__network_service_name_default_initscripts' from source: role '' defaults
41684 1727204473.37466: variable '__network_packages_default_initscripts' from source: role '' defaults
41684 1727204473.37469: variable '__network_packages_default_initscripts' from source: role '' defaults
41684 1727204473.37508: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults
41684 1727204473.38018: variable 'network_connections' from source: task vars
41684 1727204473.38021: variable 'interface1' from source: play vars
41684 1727204473.38087: variable 'interface1' from source: play vars
41684 1727204473.38155: variable 'interface1_mac' from source: set_fact
41684 1727204473.38177: variable 'ansible_distribution' from source: facts
41684 1727204473.38180: variable '__network_rh_distros' from source: role '' defaults
41684 1727204473.38190: variable 'ansible_distribution_major_version' from source: facts
41684 1727204473.38211: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults
41684 1727204473.38389: variable 'ansible_distribution' from source: facts
41684 1727204473.38393: variable '__network_rh_distros' from source: role '' defaults
41684 1727204473.38395: variable 'ansible_distribution_major_version' from source: facts
41684 1727204473.38409: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults
41684 1727204473.38561: variable 'ansible_distribution' from source: facts
41684 1727204473.38564: variable '__network_rh_distros' from source: role '' defaults
41684 1727204473.38573: variable 'ansible_distribution_major_version' from source: facts
41684 1727204473.38600: variable 'network_provider' from source: set_fact
41684 1727204473.38616: variable 'omit' from source: magic vars
41684 1727204473.38638: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection
41684 1727204473.38661: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False)
41684 1727204473.38682: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell
41684 1727204473.38695: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
41684 1727204473.38703: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
41684 1727204473.38724: variable 'inventory_hostname' from source: host vars for 'managed-node1'
41684 1727204473.38727: variable 'ansible_host' from source: host vars for 'managed-node1'
41684 1727204473.38730: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1'
41684 1727204473.38806: Set connection var ansible_connection to ssh
41684 1727204473.38810: Set connection var ansible_pipelining to False
41684 1727204473.38816: Set connection var ansible_module_compression to ZIP_DEFLATED
41684 1727204473.38821: Set connection var ansible_timeout to 10
41684 1727204473.38827: Set connection var ansible_shell_executable to /bin/sh
41684 1727204473.38834: Set connection var ansible_shell_type to sh
41684 1727204473.38860: variable 'ansible_shell_executable' from source: unknown
41684 1727204473.38867: variable 'ansible_connection' from source: unknown
41684 1727204473.38869: variable 'ansible_module_compression' from source: unknown
41684 1727204473.38873: variable 'ansible_shell_type' from source: unknown
41684 1727204473.38877: variable 'ansible_shell_executable' from source: unknown
41684 1727204473.38880: variable 'ansible_host' from source: host vars for 'managed-node1'
41684 1727204473.38884: variable 'ansible_pipelining' from source: unknown
41684 1727204473.38887: variable 'ansible_timeout' from source: unknown
41684 1727204473.38891: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1'
41684 1727204473.38970: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False)
41684 1727204473.38977: variable 'omit' from source: magic vars
41684 1727204473.38989: starting attempt loop
41684 1727204473.38992: running the handler
41684 1727204473.39044: variable 'ansible_facts' from source: unknown
41684 1727204473.39650: _low_level_execute_command(): starting
41684 1727204473.39654: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0'
41684 1727204473.40165: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<<
41684 1727204473.40179: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<<
41684 1727204473.40211: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found <<<
41684 1727204473.40223: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<<
41684 1727204473.40227: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
41684 1727204473.40282: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<<
41684 1727204473.40294: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<<
41684 1727204473.40304: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
41684 1727204473.40373: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
41684 1727204473.42042: stdout chunk (state=3): >>>/root <<<
41684 1727204473.42172: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
41684 1727204473.42247: stderr chunk (state=3): >>><<<
41684 1727204473.42257: stdout chunk (state=3): >>><<<
41684 1727204473.42297: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0
41684 1727204473.42316: _low_level_execute_command():
starting 41684 1727204473.42327: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204473.4230454-43904-278354985201612 `" && echo ansible-tmp-1727204473.4230454-43904-278354985201612="` echo /root/.ansible/tmp/ansible-tmp-1727204473.4230454-43904-278354985201612 `" ) && sleep 0' 41684 1727204473.43068: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 41684 1727204473.43082: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41684 1727204473.43097: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41684 1727204473.43115: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41684 1727204473.43169: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 41684 1727204473.43183: stderr chunk (state=3): >>>debug2: match not found <<< 41684 1727204473.43197: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204473.43213: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 41684 1727204473.43225: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 41684 1727204473.43236: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 41684 1727204473.43259: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41684 1727204473.43286: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41684 1727204473.43303: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41684 1727204473.43317: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 41684 1727204473.43329: stderr 
chunk (state=3): >>>debug2: match found <<< 41684 1727204473.43343: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204473.43430: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 41684 1727204473.43453: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41684 1727204473.43482: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41684 1727204473.43567: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41684 1727204473.45419: stdout chunk (state=3): >>>ansible-tmp-1727204473.4230454-43904-278354985201612=/root/.ansible/tmp/ansible-tmp-1727204473.4230454-43904-278354985201612 <<< 41684 1727204473.45540: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41684 1727204473.45603: stderr chunk (state=3): >>><<< 41684 1727204473.45606: stdout chunk (state=3): >>><<< 41684 1727204473.45645: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204473.4230454-43904-278354985201612=/root/.ansible/tmp/ansible-tmp-1727204473.4230454-43904-278354985201612 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match 
found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41684 1727204473.45842: variable 'ansible_module_compression' from source: unknown 41684 1727204473.45845: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-41684fyviudxd/ansiballz_cache/ansible.modules.systemd-ZIP_DEFLATED 41684 1727204473.45847: variable 'ansible_facts' from source: unknown 41684 1727204473.46128: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204473.4230454-43904-278354985201612/AnsiballZ_systemd.py 41684 1727204473.46222: Sending initial data 41684 1727204473.46228: Sent initial data (156 bytes) 41684 1727204473.47797: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 41684 1727204473.47803: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41684 1727204473.47813: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41684 1727204473.47824: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41684 1727204473.47855: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 41684 1727204473.47868: stderr chunk (state=3): >>>debug2: match not found <<< 41684 1727204473.47871: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204473.47890: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 41684 1727204473.47894: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 41684 1727204473.47902: stderr chunk (state=3): >>>debug2: match found <<< 41684 1727204473.47907: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204473.47954: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 41684 1727204473.47980: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41684 1727204473.47983: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41684 1727204473.48044: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41684 1727204473.49733: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 41684 1727204473.49793: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 41684 1727204473.49845: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-41684fyviudxd/tmpnl3kfkeh /root/.ansible/tmp/ansible-tmp-1727204473.4230454-43904-278354985201612/AnsiballZ_systemd.py <<< 41684 1727204473.49900: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 41684 1727204473.53088: stderr chunk (state=3): >>>debug2: Received exit 
status from master 0 <<< 41684 1727204473.53206: stderr chunk (state=3): >>><<< 41684 1727204473.53210: stdout chunk (state=3): >>><<< 41684 1727204473.53231: done transferring module to remote 41684 1727204473.53241: _low_level_execute_command(): starting 41684 1727204473.53247: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204473.4230454-43904-278354985201612/ /root/.ansible/tmp/ansible-tmp-1727204473.4230454-43904-278354985201612/AnsiballZ_systemd.py && sleep 0' 41684 1727204473.54379: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 41684 1727204473.54388: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41684 1727204473.54399: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41684 1727204473.54412: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41684 1727204473.54451: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 41684 1727204473.54458: stderr chunk (state=3): >>>debug2: match not found <<< 41684 1727204473.54479: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204473.54495: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 41684 1727204473.54498: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 41684 1727204473.54505: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 41684 1727204473.54513: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41684 1727204473.54522: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41684 1727204473.54538: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41684 1727204473.54540: stderr 
chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 41684 1727204473.54546: stderr chunk (state=3): >>>debug2: match found <<< 41684 1727204473.54555: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204473.54630: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 41684 1727204473.54645: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41684 1727204473.54655: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41684 1727204473.54742: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41684 1727204473.57000: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41684 1727204473.57448: stderr chunk (state=3): >>><<< 41684 1727204473.57452: stdout chunk (state=3): >>><<< 41684 1727204473.57477: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: 
mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41684 1727204473.57481: _low_level_execute_command(): starting 41684 1727204473.57483: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204473.4230454-43904-278354985201612/AnsiballZ_systemd.py && sleep 0' 41684 1727204473.58164: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 41684 1727204473.58171: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41684 1727204473.58174: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41684 1727204473.58189: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41684 1727204473.58225: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 41684 1727204473.58232: stderr chunk (state=3): >>>debug2: match not found <<< 41684 1727204473.58242: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204473.58256: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 41684 1727204473.58272: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 41684 1727204473.58275: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 41684 1727204473.58281: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41684 1727204473.58292: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41684 1727204473.58302: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41684 1727204473.58310: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 41684 
1727204473.58316: stderr chunk (state=3): >>>debug2: match found <<< 41684 1727204473.58325: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204473.58428: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 41684 1727204473.58432: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41684 1727204473.58435: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41684 1727204473.58579: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41684 1727204473.83561: stdout chunk (state=3): >>> {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "NotifyAccess": "none", "RestartUSec": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "619", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Tue 2024-09-24 14:49:25 EDT", "ExecMainStartTimestampMonotonic": "28837083", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "619", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; 
argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.<<< 41684 1727204473.83577: stdout chunk (state=3): >>>service", "ControlGroupId": "2418", "MemoryCurrent": "14225408", "MemoryAvailable": "infinity", "CPUUsageNSec": "1447802000", "TasksCurrent": "3", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "MemoryHigh": "infinity", "MemoryMax": "infinity", "MemorySwapMax": "infinity", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22342", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", 
"ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13964", "LimitNPROCSoft": "13964", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13964", "LimitSIGPENDINGSoft": "13964", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", 
"PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogS<<< 41684 1727204473.83643: stdout chunk (state=3): >>>ignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "dbus.socket sysinit.target system.slice", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "cloud-init.service NetworkManager-wait-online.service network.target network.service shutdown.target multi-user.target", "After": "dbus.socket systemd-journald.socket sysinit.target network-pre.target basic.target system.slice cloud-init-local.service dbus-broker.service", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Tue 2024-09-24 14:54:21 EDT", "StateChangeTimestampMonotonic": 
"324827295", "InactiveExitTimestamp": "Tue 2024-09-24 14:49:25 EDT", "InactiveExitTimestampMonotonic": "28837278", "ActiveEnterTimestamp": "Tue 2024-09-24 14:49:27 EDT", "ActiveEnterTimestampMonotonic": "30313565", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Tue 2024-09-24 14:49:25 EDT", "ConditionTimestampMonotonic": "28833288", "AssertTimestamp": "Tue 2024-09-24 14:49:25 EDT", "AssertTimestampMonotonic": "28833291", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "a065c0d4382c4b51bfc5a74ffa3d403d", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} <<< 41684 1727204473.85181: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.148 closed. 
<<< 41684 1727204473.85236: stderr chunk (state=3): >>><<< 41684 1727204473.85241: stdout chunk (state=3): >>><<< 41684 1727204473.85257: _low_level_execute_command() done: rc=0, stdout= {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "NotifyAccess": "none", "RestartUSec": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "619", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Tue 2024-09-24 14:49:25 EDT", "ExecMainStartTimestampMonotonic": "28837083", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "619", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call 
org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "2418", "MemoryCurrent": "14225408", "MemoryAvailable": "infinity", "CPUUsageNSec": "1447802000", "TasksCurrent": "3", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "MemoryHigh": "infinity", "MemoryMax": "infinity", "MemorySwapMax": "infinity", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22342", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", 
"LimitNPROC": "13964", "LimitNPROCSoft": "13964", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13964", "LimitSIGPENDINGSoft": "13964", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", 
"MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "dbus.socket sysinit.target system.slice", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "cloud-init.service NetworkManager-wait-online.service network.target network.service shutdown.target multi-user.target", "After": "dbus.socket systemd-journald.socket sysinit.target network-pre.target basic.target system.slice cloud-init-local.service dbus-broker.service", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Tue 2024-09-24 14:54:21 EDT", "StateChangeTimestampMonotonic": "324827295", "InactiveExitTimestamp": "Tue 2024-09-24 14:49:25 EDT", "InactiveExitTimestampMonotonic": "28837278", "ActiveEnterTimestamp": "Tue 2024-09-24 14:49:27 EDT", "ActiveEnterTimestampMonotonic": "30313565", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "OnSuccessJobMode": "fail", 
"OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Tue 2024-09-24 14:49:25 EDT", "ConditionTimestampMonotonic": "28833288", "AssertTimestamp": "Tue 2024-09-24 14:49:25 EDT", "AssertTimestampMonotonic": "28833291", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "a065c0d4382c4b51bfc5a74ffa3d403d", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master 
session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.148 closed. 41684 1727204473.85373: done with _execute_module (ansible.legacy.systemd, {'name': 'NetworkManager', 'state': 'started', 'enabled': True, '_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.systemd', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204473.4230454-43904-278354985201612/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 41684 1727204473.85388: _low_level_execute_command(): starting 41684 1727204473.85393: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204473.4230454-43904-278354985201612/ > /dev/null 2>&1 && sleep 0' 41684 1727204473.86158: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 41684 1727204473.86217: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41684 1727204473.88130: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41684 1727204473.88133: stdout chunk (state=3): >>><<< 41684 1727204473.88136: stderr chunk (state=3): >>><<< 41684 1727204473.88176: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41684 1727204473.88180: handler run complete 41684 1727204473.88351: attempt loop complete, returning result 41684 1727204473.88354: _execute() done 41684 1727204473.88357: dumping result to json 41684 
1727204473.88359: done dumping result, returning 41684 1727204473.88366: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager [0affcd87-79f5-3839-086d-000000000078] 41684 1727204473.88369: sending task result for task 0affcd87-79f5-3839-086d-000000000078 ok: [managed-node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 41684 1727204473.88725: no more pending results, returning what we have 41684 1727204473.88728: results queue empty 41684 1727204473.88729: checking for any_errors_fatal 41684 1727204473.88734: done checking for any_errors_fatal 41684 1727204473.88734: checking for max_fail_percentage 41684 1727204473.88736: done checking for max_fail_percentage 41684 1727204473.88737: checking to see if all hosts have failed and the running result is not ok 41684 1727204473.88739: done checking to see if all hosts have failed 41684 1727204473.88740: getting the remaining hosts for this loop 41684 1727204473.88741: done getting the remaining hosts for this loop 41684 1727204473.88745: getting the next task for host managed-node1 41684 1727204473.88752: done getting next task for host managed-node1 41684 1727204473.88756: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 41684 1727204473.88758: ^ state is: HOST STATE: block=3, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 41684 1727204473.88773: done sending task result for task 0affcd87-79f5-3839-086d-000000000078 41684 1727204473.88777: WORKER PROCESS EXITING 41684 1727204473.88782: getting variables 41684 1727204473.88784: in VariableManager get_vars() 41684 1727204473.88822: Calling all_inventory to load vars for managed-node1 41684 1727204473.88825: Calling groups_inventory to load vars for managed-node1 41684 1727204473.88827: Calling all_plugins_inventory to load vars for managed-node1 41684 1727204473.88836: Calling all_plugins_play to load vars for managed-node1 41684 1727204473.88838: Calling groups_plugins_inventory to load vars for managed-node1 41684 1727204473.88841: Calling groups_plugins_play to load vars for managed-node1 41684 1727204473.89823: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41684 1727204473.91613: done with get_vars() 41684 1727204473.91642: done getting variables 41684 1727204473.91719: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start wpa_supplicant] ***** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133 Tuesday 24 September 2024 15:01:13 -0400 (0:00:00.608) 0:00:30.319 ***** 41684 1727204473.91756: entering _queue_task() for managed-node1/service 41684 1727204473.92141: worker is 1 (out of 1 available) 41684 1727204473.92153: exiting _queue_task() for managed-node1/service 41684 1727204473.92170: done queuing things up, now waiting for results queue to drain 41684 1727204473.92172: waiting for pending results... 
41684 1727204473.92512: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 41684 1727204473.92621: in run() - task 0affcd87-79f5-3839-086d-000000000079 41684 1727204473.92635: variable 'ansible_search_path' from source: unknown 41684 1727204473.92639: variable 'ansible_search_path' from source: unknown 41684 1727204473.92677: calling self._execute() 41684 1727204473.92755: variable 'ansible_host' from source: host vars for 'managed-node1' 41684 1727204473.92760: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41684 1727204473.92773: variable 'omit' from source: magic vars 41684 1727204473.93111: variable 'ansible_distribution_major_version' from source: facts 41684 1727204473.93121: Evaluated conditional (ansible_distribution_major_version != '6'): True 41684 1727204473.93204: variable 'network_provider' from source: set_fact 41684 1727204473.93210: Evaluated conditional (network_provider == "nm"): True 41684 1727204473.93281: variable '__network_wpa_supplicant_required' from source: role '' defaults 41684 1727204473.93342: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 41684 1727204473.93465: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 41684 1727204473.96080: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 41684 1727204473.96126: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 41684 1727204473.96156: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 41684 1727204473.96184: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 41684 1727204473.96204: Loading FilterModule 'urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 41684 1727204473.96274: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 41684 1727204473.96294: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 41684 1727204473.96312: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 41684 1727204473.96366: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 41684 1727204473.96534: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 41684 1727204473.96578: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 41684 1727204473.96636: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 41684 1727204473.96693: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 41684 1727204473.96792: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 41684 1727204473.96833: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 41684 1727204473.96931: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 41684 1727204473.96982: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 41684 1727204473.97011: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 41684 1727204473.97358: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 41684 1727204473.97375: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 41684 1727204473.97521: variable 'network_connections' from source: task vars 41684 1727204473.97534: variable 'interface1' from source: play vars 41684 1727204473.97619: variable 'interface1' from source: play vars 41684 1727204473.97708: variable 'interface1_mac' from source: set_fact 41684 1727204473.97812: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 41684 1727204473.98023: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 
41684 1727204473.98076: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 41684 1727204473.98109: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 41684 1727204473.98141: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 41684 1727204473.98294: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 41684 1727204473.98318: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 41684 1727204473.98346: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 41684 1727204473.99015: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 41684 1727204473.99199: variable '__network_wireless_connections_defined' from source: role '' defaults 41684 1727204473.99915: variable 'network_connections' from source: task vars 41684 1727204473.99926: variable 'interface1' from source: play vars 41684 1727204474.00004: variable 'interface1' from source: play vars 41684 1727204474.00203: variable 'interface1_mac' from source: set_fact 41684 1727204474.00397: Evaluated conditional (__network_wpa_supplicant_required): False 41684 1727204474.00405: when evaluation is False, skipping this task 41684 1727204474.00413: _execute() done 41684 1727204474.00421: dumping result to json 41684 1727204474.00428: done dumping result, returning 41684 1727204474.00469: done running TaskExecutor() for 
managed-node1/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant [0affcd87-79f5-3839-086d-000000000079] 41684 1727204474.00577: sending task result for task 0affcd87-79f5-3839-086d-000000000079 skipping: [managed-node1] => { "changed": false, "false_condition": "__network_wpa_supplicant_required", "skip_reason": "Conditional result was False" } 41684 1727204474.00737: no more pending results, returning what we have 41684 1727204474.00742: results queue empty 41684 1727204474.00743: checking for any_errors_fatal 41684 1727204474.00767: done checking for any_errors_fatal 41684 1727204474.00768: checking for max_fail_percentage 41684 1727204474.00770: done checking for max_fail_percentage 41684 1727204474.00771: checking to see if all hosts have failed and the running result is not ok 41684 1727204474.00772: done checking to see if all hosts have failed 41684 1727204474.00773: getting the remaining hosts for this loop 41684 1727204474.00775: done getting the remaining hosts for this loop 41684 1727204474.00780: getting the next task for host managed-node1 41684 1727204474.00788: done getting next task for host managed-node1 41684 1727204474.00792: ^ task is: TASK: fedora.linux_system_roles.network : Enable network service 41684 1727204474.00795: ^ state is: HOST STATE: block=3, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 41684 1727204474.00814: getting variables 41684 1727204474.00817: in VariableManager get_vars() 41684 1727204474.00869: Calling all_inventory to load vars for managed-node1 41684 1727204474.00872: Calling groups_inventory to load vars for managed-node1 41684 1727204474.00874: Calling all_plugins_inventory to load vars for managed-node1 41684 1727204474.00885: Calling all_plugins_play to load vars for managed-node1 41684 1727204474.00888: Calling groups_plugins_inventory to load vars for managed-node1 41684 1727204474.00890: Calling groups_plugins_play to load vars for managed-node1 41684 1727204474.02078: done sending task result for task 0affcd87-79f5-3839-086d-000000000079 41684 1727204474.02082: WORKER PROCESS EXITING 41684 1727204474.04065: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41684 1727204474.07737: done with get_vars() 41684 1727204474.07777: done getting variables 41684 1727204474.07979: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable network service] ************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142 Tuesday 24 September 2024 15:01:14 -0400 (0:00:00.162) 0:00:30.481 ***** 41684 1727204474.08013: entering _queue_task() for managed-node1/service 41684 1727204474.08879: worker is 1 (out of 1 available) 41684 1727204474.08891: exiting _queue_task() for managed-node1/service 41684 1727204474.08905: done queuing things up, now waiting for results queue to drain 41684 1727204474.08907: waiting for pending results... 
41684 1727204474.09771: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Enable network service 41684 1727204474.10080: in run() - task 0affcd87-79f5-3839-086d-00000000007a 41684 1727204474.10101: variable 'ansible_search_path' from source: unknown 41684 1727204474.10105: variable 'ansible_search_path' from source: unknown 41684 1727204474.10186: calling self._execute() 41684 1727204474.10288: variable 'ansible_host' from source: host vars for 'managed-node1' 41684 1727204474.10293: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41684 1727204474.10303: variable 'omit' from source: magic vars 41684 1727204474.10661: variable 'ansible_distribution_major_version' from source: facts 41684 1727204474.10675: Evaluated conditional (ansible_distribution_major_version != '6'): True 41684 1727204474.10781: variable 'network_provider' from source: set_fact 41684 1727204474.10787: Evaluated conditional (network_provider == "initscripts"): False 41684 1727204474.10790: when evaluation is False, skipping this task 41684 1727204474.10793: _execute() done 41684 1727204474.10795: dumping result to json 41684 1727204474.10798: done dumping result, returning 41684 1727204474.10810: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Enable network service [0affcd87-79f5-3839-086d-00000000007a] 41684 1727204474.10813: sending task result for task 0affcd87-79f5-3839-086d-00000000007a 41684 1727204474.10913: done sending task result for task 0affcd87-79f5-3839-086d-00000000007a 41684 1727204474.10916: WORKER PROCESS EXITING skipping: [managed-node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 41684 1727204474.10959: no more pending results, returning what we have 41684 1727204474.10967: results queue empty 41684 1727204474.10968: checking for any_errors_fatal 41684 1727204474.10975: done checking for 
any_errors_fatal 41684 1727204474.10975: checking for max_fail_percentage 41684 1727204474.10977: done checking for max_fail_percentage 41684 1727204474.10978: checking to see if all hosts have failed and the running result is not ok 41684 1727204474.10979: done checking to see if all hosts have failed 41684 1727204474.10979: getting the remaining hosts for this loop 41684 1727204474.10981: done getting the remaining hosts for this loop 41684 1727204474.10985: getting the next task for host managed-node1 41684 1727204474.10992: done getting next task for host managed-node1 41684 1727204474.10995: ^ task is: TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 41684 1727204474.10999: ^ state is: HOST STATE: block=3, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 41684 1727204474.11018: getting variables 41684 1727204474.11020: in VariableManager get_vars() 41684 1727204474.11058: Calling all_inventory to load vars for managed-node1 41684 1727204474.11065: Calling groups_inventory to load vars for managed-node1 41684 1727204474.11067: Calling all_plugins_inventory to load vars for managed-node1 41684 1727204474.11077: Calling all_plugins_play to load vars for managed-node1 41684 1727204474.11080: Calling groups_plugins_inventory to load vars for managed-node1 41684 1727204474.11082: Calling groups_plugins_play to load vars for managed-node1 41684 1727204474.13109: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41684 1727204474.15945: done with get_vars() 41684 1727204474.15982: done getting variables 41684 1727204474.16095: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ensure initscripts network file dependency is present] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150 Tuesday 24 September 2024 15:01:14 -0400 (0:00:00.081) 0:00:30.562 ***** 41684 1727204474.16129: entering _queue_task() for managed-node1/copy 41684 1727204474.16939: worker is 1 (out of 1 available) 41684 1727204474.16952: exiting _queue_task() for managed-node1/copy 41684 1727204474.16968: done queuing things up, now waiting for results queue to drain 41684 1727204474.16970: waiting for pending results... 
41684 1727204474.17892: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 41684 1727204474.18256: in run() - task 0affcd87-79f5-3839-086d-00000000007b 41684 1727204474.18272: variable 'ansible_search_path' from source: unknown 41684 1727204474.18276: variable 'ansible_search_path' from source: unknown 41684 1727204474.18314: calling self._execute() 41684 1727204474.18530: variable 'ansible_host' from source: host vars for 'managed-node1' 41684 1727204474.18534: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41684 1727204474.18545: variable 'omit' from source: magic vars 41684 1727204474.19488: variable 'ansible_distribution_major_version' from source: facts 41684 1727204474.19493: Evaluated conditional (ansible_distribution_major_version != '6'): True 41684 1727204474.19822: variable 'network_provider' from source: set_fact 41684 1727204474.19826: Evaluated conditional (network_provider == "initscripts"): False 41684 1727204474.19829: when evaluation is False, skipping this task 41684 1727204474.19832: _execute() done 41684 1727204474.19834: dumping result to json 41684 1727204474.19839: done dumping result, returning 41684 1727204474.19848: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present [0affcd87-79f5-3839-086d-00000000007b] 41684 1727204474.19854: sending task result for task 0affcd87-79f5-3839-086d-00000000007b 41684 1727204474.20078: done sending task result for task 0affcd87-79f5-3839-086d-00000000007b 41684 1727204474.20081: WORKER PROCESS EXITING skipping: [managed-node1] => { "changed": false, "false_condition": "network_provider == \"initscripts\"", "skip_reason": "Conditional result was False" } 41684 1727204474.20151: no more pending results, returning what we have 41684 1727204474.20156: results queue empty 41684 1727204474.20157: checking for 
any_errors_fatal 41684 1727204474.20163: done checking for any_errors_fatal 41684 1727204474.20166: checking for max_fail_percentage 41684 1727204474.20168: done checking for max_fail_percentage 41684 1727204474.20169: checking to see if all hosts have failed and the running result is not ok 41684 1727204474.20170: done checking to see if all hosts have failed 41684 1727204474.20171: getting the remaining hosts for this loop 41684 1727204474.20173: done getting the remaining hosts for this loop 41684 1727204474.20178: getting the next task for host managed-node1 41684 1727204474.20187: done getting next task for host managed-node1 41684 1727204474.20191: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking connection profiles 41684 1727204474.20194: ^ state is: HOST STATE: block=3, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 41684 1727204474.20217: getting variables 41684 1727204474.20219: in VariableManager get_vars() 41684 1727204474.20270: Calling all_inventory to load vars for managed-node1 41684 1727204474.20274: Calling groups_inventory to load vars for managed-node1 41684 1727204474.20277: Calling all_plugins_inventory to load vars for managed-node1 41684 1727204474.20290: Calling all_plugins_play to load vars for managed-node1 41684 1727204474.20293: Calling groups_plugins_inventory to load vars for managed-node1 41684 1727204474.20297: Calling groups_plugins_play to load vars for managed-node1 41684 1727204474.23252: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41684 1727204474.26848: done with get_vars() 41684 1727204474.26882: done getting variables TASK [fedora.linux_system_roles.network : Configure networking connection profiles] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159 Tuesday 24 September 2024 15:01:14 -0400 (0:00:00.110) 0:00:30.673 ***** 41684 1727204474.27166: entering _queue_task() for managed-node1/fedora.linux_system_roles.network_connections 41684 1727204474.27834: worker is 1 (out of 1 available) 41684 1727204474.27847: exiting _queue_task() for managed-node1/fedora.linux_system_roles.network_connections 41684 1727204474.27861: done queuing things up, now waiting for results queue to drain 41684 1727204474.27862: waiting for pending results... 
41684 1727204474.28874: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Configure networking connection profiles 41684 1727204474.29294: in run() - task 0affcd87-79f5-3839-086d-00000000007c 41684 1727204474.29298: variable 'ansible_search_path' from source: unknown 41684 1727204474.29301: variable 'ansible_search_path' from source: unknown 41684 1727204474.29304: calling self._execute() 41684 1727204474.29770: variable 'ansible_host' from source: host vars for 'managed-node1' 41684 1727204474.29774: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41684 1727204474.29777: variable 'omit' from source: magic vars 41684 1727204474.30255: variable 'ansible_distribution_major_version' from source: facts 41684 1727204474.30268: Evaluated conditional (ansible_distribution_major_version != '6'): True 41684 1727204474.30309: variable 'omit' from source: magic vars 41684 1727204474.30360: variable 'omit' from source: magic vars 41684 1727204474.30755: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 41684 1727204474.36141: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 41684 1727204474.36416: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 41684 1727204474.36452: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 41684 1727204474.36487: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 41684 1727204474.36630: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 41684 1727204474.36782: variable 'network_provider' from source: set_fact 41684 1727204474.37035: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 41684 1727204474.37070: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 41684 1727204474.37096: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 41684 1727204474.37135: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 41684 1727204474.37271: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 41684 1727204474.37342: variable 'omit' from source: magic vars 41684 1727204474.37683: variable 'omit' from source: magic vars 41684 1727204474.37901: variable 'network_connections' from source: task vars 41684 1727204474.37921: variable 'interface1' from source: play vars 41684 1727204474.37986: variable 'interface1' from source: play vars 41684 1727204474.38568: variable 'interface1_mac' from source: set_fact 41684 1727204474.38620: variable 'omit' from source: magic vars 41684 1727204474.38628: variable '__lsr_ansible_managed' from source: task vars 41684 1727204474.38808: variable '__lsr_ansible_managed' from source: task vars 41684 1727204474.39279: Loaded config def from plugin (lookup/template) 41684 1727204474.39282: Loading LookupModule 'template' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/template.py 41684 1727204474.39425: File lookup term: get_ansible_managed.j2 41684 1727204474.39429: variable 'ansible_search_path' from source: 
unknown 41684 1727204474.39441: evaluation_path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks 41684 1727204474.39454: search_path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/templates/get_ansible_managed.j2 /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/get_ansible_managed.j2 /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/templates/get_ansible_managed.j2 /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/get_ansible_managed.j2 /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 41684 1727204474.39472: variable 'ansible_search_path' from source: unknown 41684 1727204474.53479: variable 'ansible_managed' from source: unknown 41684 1727204474.53732: variable 'omit' from source: magic vars 41684 1727204474.53880: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 41684 1727204474.53908: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 41684 1727204474.53927: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 41684 1727204474.53944: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, 
class_only=False) 41684 1727204474.53953: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41684 1727204474.54196: variable 'inventory_hostname' from source: host vars for 'managed-node1' 41684 1727204474.54199: variable 'ansible_host' from source: host vars for 'managed-node1' 41684 1727204474.54202: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41684 1727204474.54415: Set connection var ansible_connection to ssh 41684 1727204474.54420: Set connection var ansible_pipelining to False 41684 1727204474.54426: Set connection var ansible_module_compression to ZIP_DEFLATED 41684 1727204474.54431: Set connection var ansible_timeout to 10 41684 1727204474.54438: Set connection var ansible_shell_executable to /bin/sh 41684 1727204474.54441: Set connection var ansible_shell_type to sh 41684 1727204474.54468: variable 'ansible_shell_executable' from source: unknown 41684 1727204474.54471: variable 'ansible_connection' from source: unknown 41684 1727204474.54473: variable 'ansible_module_compression' from source: unknown 41684 1727204474.54476: variable 'ansible_shell_type' from source: unknown 41684 1727204474.54478: variable 'ansible_shell_executable' from source: unknown 41684 1727204474.54481: variable 'ansible_host' from source: host vars for 'managed-node1' 41684 1727204474.54483: variable 'ansible_pipelining' from source: unknown 41684 1727204474.54486: variable 'ansible_timeout' from source: unknown 41684 1727204474.54488: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41684 1727204474.54856: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 41684 1727204474.54871: variable 'omit' from source: magic vars 41684 
1727204474.54874: starting attempt loop 41684 1727204474.54877: running the handler 41684 1727204474.54891: _low_level_execute_command(): starting 41684 1727204474.54898: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 41684 1727204474.56889: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 41684 1727204474.56897: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41684 1727204474.56956: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found <<< 41684 1727204474.56960: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 41684 1727204474.57027: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 41684 1727204474.57031: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found <<< 41684 1727204474.57072: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204474.57188: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 41684 1727204474.57194: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41684 1727204474.57359: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41684 1727204474.57465: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41684 1727204474.59125: stdout chunk (state=3): >>>/root <<< 
41684 1727204474.59309: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41684 1727204474.59313: stderr chunk (state=3): >>><<< 41684 1727204474.59315: stdout chunk (state=3): >>><<< 41684 1727204474.59338: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41684 1727204474.59350: _low_level_execute_command(): starting 41684 1727204474.59358: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204474.5933867-44001-259965220064234 `" && echo ansible-tmp-1727204474.5933867-44001-259965220064234="` echo /root/.ansible/tmp/ansible-tmp-1727204474.5933867-44001-259965220064234 `" ) && sleep 0' 41684 1727204474.59952: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 41684 1727204474.59968: stderr 
chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41684 1727204474.59974: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41684 1727204474.59987: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41684 1727204474.60022: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 41684 1727204474.60029: stderr chunk (state=3): >>>debug2: match not found <<< 41684 1727204474.60038: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204474.60052: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 41684 1727204474.60059: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 41684 1727204474.60068: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 41684 1727204474.60077: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41684 1727204474.60084: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41684 1727204474.60096: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41684 1727204474.60103: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 41684 1727204474.60109: stderr chunk (state=3): >>>debug2: match found <<< 41684 1727204474.60117: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204474.60187: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 41684 1727204474.60200: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41684 1727204474.60209: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41684 1727204474.60295: stderr chunk (state=3): 
>>>debug1: mux_client_request_session: master session id: 2 <<< 41684 1727204474.62174: stdout chunk (state=3): >>>ansible-tmp-1727204474.5933867-44001-259965220064234=/root/.ansible/tmp/ansible-tmp-1727204474.5933867-44001-259965220064234 <<< 41684 1727204474.62355: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41684 1727204474.62358: stderr chunk (state=3): >>><<< 41684 1727204474.62361: stdout chunk (state=3): >>><<< 41684 1727204474.62379: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204474.5933867-44001-259965220064234=/root/.ansible/tmp/ansible-tmp-1727204474.5933867-44001-259965220064234 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41684 1727204474.62427: variable 'ansible_module_compression' from source: unknown 41684 1727204474.62475: ANSIBALLZ: using cached module: 
/root/.ansible/tmp/ansible-local-41684fyviudxd/ansiballz_cache/ansible_collections.fedora.linux_system_roles.plugins.modules.network_connections-ZIP_DEFLATED 41684 1727204474.62506: variable 'ansible_facts' from source: unknown 41684 1727204474.62604: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204474.5933867-44001-259965220064234/AnsiballZ_network_connections.py 41684 1727204474.62747: Sending initial data 41684 1727204474.62750: Sent initial data (168 bytes) 41684 1727204474.63682: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 41684 1727204474.63693: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41684 1727204474.63729: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41684 1727204474.63732: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41684 1727204474.63768: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 41684 1727204474.63773: stderr chunk (state=3): >>>debug2: match not found <<< 41684 1727204474.63783: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204474.63944: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 41684 1727204474.63952: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 41684 1727204474.63955: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 41684 1727204474.63966: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41684 1727204474.63969: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41684 1727204474.63971: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41684 1727204474.63973: stderr chunk (state=3): >>>debug2: checking 
match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 41684 1727204474.63976: stderr chunk (state=3): >>>debug2: match found <<< 41684 1727204474.63978: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204474.63980: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 41684 1727204474.63982: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41684 1727204474.63984: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41684 1727204474.64639: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41684 1727204474.66341: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 41684 1727204474.66394: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 41684 1727204474.66449: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-41684fyviudxd/tmp03eywohj /root/.ansible/tmp/ansible-tmp-1727204474.5933867-44001-259965220064234/AnsiballZ_network_connections.py <<< 41684 1727204474.66501: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 41684 1727204474.69070: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41684 1727204474.69093: stderr chunk (state=3): >>><<< 41684 
1727204474.69096: stdout chunk (state=3): >>><<< 41684 1727204474.69187: done transferring module to remote 41684 1727204474.69191: _low_level_execute_command(): starting 41684 1727204474.69193: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204474.5933867-44001-259965220064234/ /root/.ansible/tmp/ansible-tmp-1727204474.5933867-44001-259965220064234/AnsiballZ_network_connections.py && sleep 0' 41684 1727204474.70403: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 41684 1727204474.70408: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41684 1727204474.70428: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41684 1727204474.70460: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204474.70468: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration <<< 41684 1727204474.70481: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41684 1727204474.70487: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found <<< 41684 1727204474.70501: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204474.70592: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 41684 1727204474.70623: stderr chunk (state=3): >>>debug2: 
mux_client_hello_exchange: master version 4 <<< 41684 1727204474.70684: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41684 1727204474.72437: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41684 1727204474.72441: stderr chunk (state=3): >>><<< 41684 1727204474.72450: stdout chunk (state=3): >>><<< 41684 1727204474.72467: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41684 1727204474.72471: _low_level_execute_command(): starting 41684 1727204474.72473: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204474.5933867-44001-259965220064234/AnsiballZ_network_connections.py && sleep 0' 41684 1727204474.73128: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 41684 1727204474.73136: stderr chunk (state=3): 
>>>debug1: Reading configuration data /root/.ssh/config <<< 41684 1727204474.73146: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41684 1727204474.73156: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41684 1727204474.73189: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 41684 1727204474.73196: stderr chunk (state=3): >>>debug2: match not found <<< 41684 1727204474.73205: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204474.73216: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 41684 1727204474.73219: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 41684 1727204474.73224: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 41684 1727204474.73231: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41684 1727204474.73236: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41684 1727204474.73247: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41684 1727204474.73254: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 41684 1727204474.73259: stderr chunk (state=3): >>>debug2: match found <<< 41684 1727204474.73267: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204474.73331: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 41684 1727204474.73336: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41684 1727204474.73340: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41684 1727204474.73412: stderr chunk (state=3): >>>debug1: 
mux_client_request_session: master session id: 2 <<< 41684 1727204474.98813: stdout chunk (state=3): >>> {"changed": true, "warnings": [], "stderr": "[002] #0, state:None persistent_state:present, 'ethtest1': update connection ethtest1, 7d6131f1-a08f-4727-b007-3042c5fbcd66\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "ethtest1", "mac": "96:07:24:63:96:ac", "type": "ethernet", "autoconnect": false, "ip": {"address": ["198.51.100.4/24", "2001:db8::6/32"], "route": [{"network": "198.58.10.64", "prefix": 26, "gateway": "198.51.100.102", "metric": 4}]}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "ethtest1", "mac": "96:07:24:63:96:ac", "type": "ethernet", "autoconnect": false, "ip": {"address": ["198.51.100.4/24", "2001:db8::6/32"], "route": [{"network": "198.58.10.64", "prefix": 26, "gateway": "198.51.100.102", "metric": 4}]}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} <<< 41684 1727204475.00155: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.148 closed. 
<<< 41684 1727204475.00302: stderr chunk (state=3): >>><<< 41684 1727204475.00308: stdout chunk (state=3): >>><<< 41684 1727204475.00320: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "warnings": [], "stderr": "[002] #0, state:None persistent_state:present, 'ethtest1': update connection ethtest1, 7d6131f1-a08f-4727-b007-3042c5fbcd66\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "ethtest1", "mac": "96:07:24:63:96:ac", "type": "ethernet", "autoconnect": false, "ip": {"address": ["198.51.100.4/24", "2001:db8::6/32"], "route": [{"network": "198.58.10.64", "prefix": 26, "gateway": "198.51.100.102", "metric": 4}]}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "ethtest1", "mac": "96:07:24:63:96:ac", "type": "ethernet", "autoconnect": false, "ip": {"address": ["198.51.100.4/24", "2001:db8::6/32"], "route": [{"network": "198.58.10.64", "prefix": 26, "gateway": "198.51.100.102", "metric": 4}]}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.148 closed. 41684 1727204475.00396: done with _execute_module (fedora.linux_system_roles.network_connections, {'provider': 'nm', 'connections': [{'name': 'ethtest1', 'mac': '96:07:24:63:96:ac', 'type': 'ethernet', 'autoconnect': False, 'ip': {'address': ['198.51.100.4/24', '2001:db8::6/32'], 'route': [{'network': '198.58.10.64', 'prefix': 26, 'gateway': '198.51.100.102', 'metric': 4}]}}], '__header': '#\n# Ansible managed\n#\n# system_role:network\n', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'fedora.linux_system_roles.network_connections', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204474.5933867-44001-259965220064234/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 41684 1727204475.00399: _low_level_execute_command(): starting 41684 1727204475.00402: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204474.5933867-44001-259965220064234/ > /dev/null 2>&1 && sleep 0' 41684 1727204475.01371: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 41684 
1727204475.01377: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41684 1727204475.01388: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41684 1727204475.01405: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41684 1727204475.01523: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 41684 1727204475.01527: stderr chunk (state=3): >>>debug2: match not found <<< 41684 1727204475.01535: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204475.01551: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 41684 1727204475.01554: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 41684 1727204475.01566: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 41684 1727204475.01569: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41684 1727204475.01588: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41684 1727204475.01594: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41684 1727204475.01597: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 41684 1727204475.01602: stderr chunk (state=3): >>>debug2: match found <<< 41684 1727204475.01621: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204475.01691: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 41684 1727204475.01723: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41684 1727204475.01730: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41684 1727204475.01826: 
stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41684 1727204475.03597: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41684 1727204475.03722: stderr chunk (state=3): >>><<< 41684 1727204475.03742: stdout chunk (state=3): >>><<< 41684 1727204475.03906: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41684 1727204475.03910: handler run complete 41684 1727204475.03913: attempt loop complete, returning result 41684 1727204475.03923: _execute() done 41684 1727204475.03927: dumping result to json 41684 1727204475.03929: done dumping result, returning 41684 1727204475.03931: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Configure networking connection profiles [0affcd87-79f5-3839-086d-00000000007c] 41684 1727204475.03933: sending task result for 
task 0affcd87-79f5-3839-086d-00000000007c 41684 1727204475.04025: done sending task result for task 0affcd87-79f5-3839-086d-00000000007c 41684 1727204475.04034: WORKER PROCESS EXITING changed: [managed-node1] => { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "autoconnect": false, "ip": { "address": [ "198.51.100.4/24", "2001:db8::6/32" ], "route": [ { "gateway": "198.51.100.102", "metric": 4, "network": "198.58.10.64", "prefix": 26 } ] }, "mac": "96:07:24:63:96:ac", "name": "ethtest1", "type": "ethernet" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true } STDERR: [002] #0, state:None persistent_state:present, 'ethtest1': update connection ethtest1, 7d6131f1-a08f-4727-b007-3042c5fbcd66 41684 1727204475.04215: no more pending results, returning what we have 41684 1727204475.04220: results queue empty 41684 1727204475.04221: checking for any_errors_fatal 41684 1727204475.04227: done checking for any_errors_fatal 41684 1727204475.04228: checking for max_fail_percentage 41684 1727204475.04230: done checking for max_fail_percentage 41684 1727204475.04231: checking to see if all hosts have failed and the running result is not ok 41684 1727204475.04232: done checking to see if all hosts have failed 41684 1727204475.04233: getting the remaining hosts for this loop 41684 1727204475.04234: done getting the remaining hosts for this loop 41684 1727204475.04239: getting the next task for host managed-node1 41684 1727204475.04246: done getting next task for host managed-node1 41684 1727204475.04249: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking state 41684 1727204475.04252: ^ state is: HOST STATE: block=3, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 41684 1727204475.04268: getting variables 41684 1727204475.04270: in VariableManager get_vars() 41684 1727204475.04314: Calling all_inventory to load vars for managed-node1 41684 1727204475.04317: Calling groups_inventory to load vars for managed-node1 41684 1727204475.04320: Calling all_plugins_inventory to load vars for managed-node1 41684 1727204475.04331: Calling all_plugins_play to load vars for managed-node1 41684 1727204475.04334: Calling groups_plugins_inventory to load vars for managed-node1 41684 1727204475.04337: Calling groups_plugins_play to load vars for managed-node1 41684 1727204475.06970: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41684 1727204475.10069: done with get_vars() 41684 1727204475.10098: done getting variables TASK [fedora.linux_system_roles.network : Configure networking state] ********** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171 Tuesday 24 September 2024 15:01:15 -0400 (0:00:00.830) 0:00:31.503 ***** 41684 1727204475.10194: entering _queue_task() for managed-node1/fedora.linux_system_roles.network_state 41684 1727204475.10643: worker is 1 (out of 1 available) 41684 1727204475.10660: exiting _queue_task() for managed-node1/fedora.linux_system_roles.network_state 41684 1727204475.10679: done queuing things up, now waiting for results queue to drain 41684 1727204475.10681: waiting for pending results... 
41684 1727204475.10987: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Configure networking state 41684 1727204475.11081: in run() - task 0affcd87-79f5-3839-086d-00000000007d 41684 1727204475.11093: variable 'ansible_search_path' from source: unknown 41684 1727204475.11098: variable 'ansible_search_path' from source: unknown 41684 1727204475.11128: calling self._execute() 41684 1727204475.11222: variable 'ansible_host' from source: host vars for 'managed-node1' 41684 1727204475.11226: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41684 1727204475.11235: variable 'omit' from source: magic vars 41684 1727204475.11535: variable 'ansible_distribution_major_version' from source: facts 41684 1727204475.11546: Evaluated conditional (ansible_distribution_major_version != '6'): True 41684 1727204475.11631: variable 'network_state' from source: role '' defaults 41684 1727204475.11639: Evaluated conditional (network_state != {}): False 41684 1727204475.11644: when evaluation is False, skipping this task 41684 1727204475.11649: _execute() done 41684 1727204475.11652: dumping result to json 41684 1727204475.11654: done dumping result, returning 41684 1727204475.11657: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Configure networking state [0affcd87-79f5-3839-086d-00000000007d] 41684 1727204475.11666: sending task result for task 0affcd87-79f5-3839-086d-00000000007d 41684 1727204475.11750: done sending task result for task 0affcd87-79f5-3839-086d-00000000007d 41684 1727204475.11753: WORKER PROCESS EXITING skipping: [managed-node1] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 41684 1727204475.11828: no more pending results, returning what we have 41684 1727204475.11833: results queue empty 41684 1727204475.11834: checking for any_errors_fatal 41684 1727204475.11845: done checking for any_errors_fatal 
41684 1727204475.11845: checking for max_fail_percentage 41684 1727204475.11847: done checking for max_fail_percentage 41684 1727204475.11848: checking to see if all hosts have failed and the running result is not ok 41684 1727204475.11849: done checking to see if all hosts have failed 41684 1727204475.11849: getting the remaining hosts for this loop 41684 1727204475.11851: done getting the remaining hosts for this loop 41684 1727204475.11855: getting the next task for host managed-node1 41684 1727204475.11860: done getting next task for host managed-node1 41684 1727204475.11865: ^ task is: TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 41684 1727204475.11868: ^ state is: HOST STATE: block=3, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 41684 1727204475.11886: getting variables 41684 1727204475.11887: in VariableManager get_vars() 41684 1727204475.11922: Calling all_inventory to load vars for managed-node1 41684 1727204475.11924: Calling groups_inventory to load vars for managed-node1 41684 1727204475.11926: Calling all_plugins_inventory to load vars for managed-node1 41684 1727204475.11934: Calling all_plugins_play to load vars for managed-node1 41684 1727204475.11936: Calling groups_plugins_inventory to load vars for managed-node1 41684 1727204475.11938: Calling groups_plugins_play to load vars for managed-node1 41684 1727204475.13241: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41684 1727204475.15145: done with get_vars() 41684 1727204475.15174: done getting variables 41684 1727204475.15235: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show stderr messages for the network_connections] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177 Tuesday 24 September 2024 15:01:15 -0400 (0:00:00.050) 0:00:31.554 ***** 41684 1727204475.15273: entering _queue_task() for managed-node1/debug 41684 1727204475.15585: worker is 1 (out of 1 available) 41684 1727204475.15597: exiting _queue_task() for managed-node1/debug 41684 1727204475.15610: done queuing things up, now waiting for results queue to drain 41684 1727204475.15611: waiting for pending results... 
41684 1727204475.15913: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 41684 1727204475.16051: in run() - task 0affcd87-79f5-3839-086d-00000000007e 41684 1727204475.16080: variable 'ansible_search_path' from source: unknown 41684 1727204475.16088: variable 'ansible_search_path' from source: unknown 41684 1727204475.16134: calling self._execute() 41684 1727204475.16235: variable 'ansible_host' from source: host vars for 'managed-node1' 41684 1727204475.16246: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41684 1727204475.16260: variable 'omit' from source: magic vars 41684 1727204475.16651: variable 'ansible_distribution_major_version' from source: facts 41684 1727204475.16672: Evaluated conditional (ansible_distribution_major_version != '6'): True 41684 1727204475.16684: variable 'omit' from source: magic vars 41684 1727204475.16748: variable 'omit' from source: magic vars 41684 1727204475.16791: variable 'omit' from source: magic vars 41684 1727204475.16843: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 41684 1727204475.16885: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 41684 1727204475.16912: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 41684 1727204475.16940: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41684 1727204475.16957: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41684 1727204475.16994: variable 'inventory_hostname' from source: host vars for 'managed-node1' 41684 1727204475.17003: variable 'ansible_host' from source: host vars for 'managed-node1' 41684 1727204475.17010: variable 'ansible_ssh_extra_args' from 
source: host vars for 'managed-node1' 41684 1727204475.17118: Set connection var ansible_connection to ssh 41684 1727204475.17131: Set connection var ansible_pipelining to False 41684 1727204475.17146: Set connection var ansible_module_compression to ZIP_DEFLATED 41684 1727204475.17157: Set connection var ansible_timeout to 10 41684 1727204475.17172: Set connection var ansible_shell_executable to /bin/sh 41684 1727204475.17179: Set connection var ansible_shell_type to sh 41684 1727204475.17208: variable 'ansible_shell_executable' from source: unknown 41684 1727204475.17216: variable 'ansible_connection' from source: unknown 41684 1727204475.17222: variable 'ansible_module_compression' from source: unknown 41684 1727204475.17229: variable 'ansible_shell_type' from source: unknown 41684 1727204475.17234: variable 'ansible_shell_executable' from source: unknown 41684 1727204475.17241: variable 'ansible_host' from source: host vars for 'managed-node1' 41684 1727204475.17253: variable 'ansible_pipelining' from source: unknown 41684 1727204475.17259: variable 'ansible_timeout' from source: unknown 41684 1727204475.17268: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41684 1727204475.17417: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 41684 1727204475.17435: variable 'omit' from source: magic vars 41684 1727204475.17445: starting attempt loop 41684 1727204475.17452: running the handler 41684 1727204475.17592: variable '__network_connections_result' from source: set_fact 41684 1727204475.17649: handler run complete 41684 1727204475.17676: attempt loop complete, returning result 41684 1727204475.17689: _execute() done 41684 1727204475.17692: dumping result to json 41684 1727204475.17695: 
done dumping result, returning 41684 1727204475.17703: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections [0affcd87-79f5-3839-086d-00000000007e] 41684 1727204475.17708: sending task result for task 0affcd87-79f5-3839-086d-00000000007e 41684 1727204475.17802: done sending task result for task 0affcd87-79f5-3839-086d-00000000007e 41684 1727204475.17805: WORKER PROCESS EXITING ok: [managed-node1] => { "__network_connections_result.stderr_lines": [ "[002] #0, state:None persistent_state:present, 'ethtest1': update connection ethtest1, 7d6131f1-a08f-4727-b007-3042c5fbcd66" ] } 41684 1727204475.17866: no more pending results, returning what we have 41684 1727204475.17872: results queue empty 41684 1727204475.17874: checking for any_errors_fatal 41684 1727204475.17879: done checking for any_errors_fatal 41684 1727204475.17880: checking for max_fail_percentage 41684 1727204475.17882: done checking for max_fail_percentage 41684 1727204475.17883: checking to see if all hosts have failed and the running result is not ok 41684 1727204475.17883: done checking to see if all hosts have failed 41684 1727204475.17884: getting the remaining hosts for this loop 41684 1727204475.17886: done getting the remaining hosts for this loop 41684 1727204475.17890: getting the next task for host managed-node1 41684 1727204475.17897: done getting next task for host managed-node1 41684 1727204475.17901: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 41684 1727204475.17904: ^ state is: HOST STATE: block=3, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=23, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 41684 1727204475.17916: getting variables 41684 1727204475.17918: in VariableManager get_vars() 41684 1727204475.17954: Calling all_inventory to load vars for managed-node1 41684 1727204475.17956: Calling groups_inventory to load vars for managed-node1 41684 1727204475.17958: Calling all_plugins_inventory to load vars for managed-node1 41684 1727204475.17970: Calling all_plugins_play to load vars for managed-node1 41684 1727204475.17972: Calling groups_plugins_inventory to load vars for managed-node1 41684 1727204475.17975: Calling groups_plugins_play to load vars for managed-node1 41684 1727204475.19334: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41684 1727204475.20989: done with get_vars() 41684 1727204475.21014: done getting variables 41684 1727204475.21078: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_connections] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181 Tuesday 24 September 2024 15:01:15 -0400 (0:00:00.058) 0:00:31.612 ***** 41684 1727204475.21114: entering _queue_task() for managed-node1/debug 41684 1727204475.21425: worker is 1 (out of 1 available) 41684 1727204475.21438: exiting _queue_task() for managed-node1/debug 41684 1727204475.21451: done queuing things up, now waiting for results queue to drain 41684 1727204475.21453: waiting for pending results... 
41684 1727204475.21754: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 41684 1727204475.21892: in run() - task 0affcd87-79f5-3839-086d-00000000007f 41684 1727204475.21915: variable 'ansible_search_path' from source: unknown 41684 1727204475.21923: variable 'ansible_search_path' from source: unknown 41684 1727204475.21966: calling self._execute() 41684 1727204475.22069: variable 'ansible_host' from source: host vars for 'managed-node1' 41684 1727204475.22083: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41684 1727204475.22098: variable 'omit' from source: magic vars 41684 1727204475.22489: variable 'ansible_distribution_major_version' from source: facts 41684 1727204475.22509: Evaluated conditional (ansible_distribution_major_version != '6'): True 41684 1727204475.22521: variable 'omit' from source: magic vars 41684 1727204475.22587: variable 'omit' from source: magic vars 41684 1727204475.22630: variable 'omit' from source: magic vars 41684 1727204475.22683: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 41684 1727204475.22724: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 41684 1727204475.22752: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 41684 1727204475.22782: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41684 1727204475.22799: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41684 1727204475.22832: variable 'inventory_hostname' from source: host vars for 'managed-node1' 41684 1727204475.22843: variable 'ansible_host' from source: host vars for 'managed-node1' 41684 1727204475.22851: variable 'ansible_ssh_extra_args' from 
source: host vars for 'managed-node1' 41684 1727204475.22960: Set connection var ansible_connection to ssh 41684 1727204475.22976: Set connection var ansible_pipelining to False 41684 1727204475.22991: Set connection var ansible_module_compression to ZIP_DEFLATED 41684 1727204475.23002: Set connection var ansible_timeout to 10 41684 1727204475.23015: Set connection var ansible_shell_executable to /bin/sh 41684 1727204475.23023: Set connection var ansible_shell_type to sh 41684 1727204475.23051: variable 'ansible_shell_executable' from source: unknown 41684 1727204475.23059: variable 'ansible_connection' from source: unknown 41684 1727204475.23068: variable 'ansible_module_compression' from source: unknown 41684 1727204475.23076: variable 'ansible_shell_type' from source: unknown 41684 1727204475.23082: variable 'ansible_shell_executable' from source: unknown 41684 1727204475.23089: variable 'ansible_host' from source: host vars for 'managed-node1' 41684 1727204475.23101: variable 'ansible_pipelining' from source: unknown 41684 1727204475.23107: variable 'ansible_timeout' from source: unknown 41684 1727204475.23114: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41684 1727204475.23262: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 41684 1727204475.23281: variable 'omit' from source: magic vars 41684 1727204475.23292: starting attempt loop 41684 1727204475.23299: running the handler 41684 1727204475.23354: variable '__network_connections_result' from source: set_fact 41684 1727204475.23444: variable '__network_connections_result' from source: set_fact 41684 1727204475.23581: handler run complete 41684 1727204475.23616: attempt loop complete, returning result 41684 1727204475.23624: 
_execute() done 41684 1727204475.23634: dumping result to json 41684 1727204475.23646: done dumping result, returning 41684 1727204475.23660: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections [0affcd87-79f5-3839-086d-00000000007f] 41684 1727204475.23674: sending task result for task 0affcd87-79f5-3839-086d-00000000007f ok: [managed-node1] => { "__network_connections_result": { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "autoconnect": false, "ip": { "address": [ "198.51.100.4/24", "2001:db8::6/32" ], "route": [ { "gateway": "198.51.100.102", "metric": 4, "network": "198.58.10.64", "prefix": 26 } ] }, "mac": "96:07:24:63:96:ac", "name": "ethtest1", "type": "ethernet" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true, "failed": false, "stderr": "[002] #0, state:None persistent_state:present, 'ethtest1': update connection ethtest1, 7d6131f1-a08f-4727-b007-3042c5fbcd66\n", "stderr_lines": [ "[002] #0, state:None persistent_state:present, 'ethtest1': update connection ethtest1, 7d6131f1-a08f-4727-b007-3042c5fbcd66" ] } } 41684 1727204475.23885: no more pending results, returning what we have 41684 1727204475.23889: results queue empty 41684 1727204475.23891: checking for any_errors_fatal 41684 1727204475.23898: done checking for any_errors_fatal 41684 1727204475.23899: checking for max_fail_percentage 41684 1727204475.23900: done checking for max_fail_percentage 41684 1727204475.23902: checking to see if all hosts have failed and the running result is not ok 41684 1727204475.23902: done checking to see if all hosts have failed 41684 1727204475.23903: getting the remaining hosts for this loop 41684 1727204475.23905: done getting the remaining hosts for this loop 41684 1727204475.23910: getting the next task for host managed-node1 41684 
1727204475.23917: done getting next task for host managed-node1 41684 1727204475.23921: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 41684 1727204475.23925: ^ state is: HOST STATE: block=3, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 41684 1727204475.23937: getting variables 41684 1727204475.23939: in VariableManager get_vars() 41684 1727204475.23982: Calling all_inventory to load vars for managed-node1 41684 1727204475.23986: Calling groups_inventory to load vars for managed-node1 41684 1727204475.23988: Calling all_plugins_inventory to load vars for managed-node1 41684 1727204475.24006: Calling all_plugins_play to load vars for managed-node1 41684 1727204475.24009: Calling groups_plugins_inventory to load vars for managed-node1 41684 1727204475.24012: Calling groups_plugins_play to load vars for managed-node1 41684 1727204475.24984: done sending task result for task 0affcd87-79f5-3839-086d-00000000007f 41684 1727204475.24988: WORKER PROCESS EXITING 41684 1727204475.25881: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41684 1727204475.26833: done with get_vars() 41684 1727204475.26851: done getting variables 41684 1727204475.26899: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_state] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186 Tuesday 24 September 2024 15:01:15 -0400 (0:00:00.058) 0:00:31.670 ***** 41684 1727204475.26925: entering _queue_task() for managed-node1/debug 41684 1727204475.27143: worker is 1 (out of 1 available) 41684 1727204475.27156: exiting _queue_task() for managed-node1/debug 41684 1727204475.27172: done queuing things up, now waiting for results queue to drain 41684 1727204475.27173: waiting for pending results... 41684 1727204475.27388: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 41684 1727204475.27482: in run() - task 0affcd87-79f5-3839-086d-000000000080 41684 1727204475.27493: variable 'ansible_search_path' from source: unknown 41684 1727204475.27497: variable 'ansible_search_path' from source: unknown 41684 1727204475.27526: calling self._execute() 41684 1727204475.27627: variable 'ansible_host' from source: host vars for 'managed-node1' 41684 1727204475.27632: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41684 1727204475.27635: variable 'omit' from source: magic vars 41684 1727204475.27969: variable 'ansible_distribution_major_version' from source: facts 41684 1727204475.27972: Evaluated conditional (ansible_distribution_major_version != '6'): True 41684 1727204475.28173: variable 'network_state' from source: role '' defaults 41684 1727204475.28176: Evaluated conditional (network_state != {}): False 41684 1727204475.28178: when evaluation is False, skipping this task 41684 1727204475.28181: _execute() done 41684 1727204475.28182: dumping result to json 41684 1727204475.28184: done 
dumping result, returning 41684 1727204475.28186: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state [0affcd87-79f5-3839-086d-000000000080] 41684 1727204475.28188: sending task result for task 0affcd87-79f5-3839-086d-000000000080 41684 1727204475.28251: done sending task result for task 0affcd87-79f5-3839-086d-000000000080 41684 1727204475.28254: WORKER PROCESS EXITING skipping: [managed-node1] => { "false_condition": "network_state != {}" } 41684 1727204475.28319: no more pending results, returning what we have 41684 1727204475.28323: results queue empty 41684 1727204475.28323: checking for any_errors_fatal 41684 1727204475.28330: done checking for any_errors_fatal 41684 1727204475.28330: checking for max_fail_percentage 41684 1727204475.28332: done checking for max_fail_percentage 41684 1727204475.28333: checking to see if all hosts have failed and the running result is not ok 41684 1727204475.28334: done checking to see if all hosts have failed 41684 1727204475.28334: getting the remaining hosts for this loop 41684 1727204475.28335: done getting the remaining hosts for this loop 41684 1727204475.28337: getting the next task for host managed-node1 41684 1727204475.28341: done getting next task for host managed-node1 41684 1727204475.28344: ^ task is: TASK: fedora.linux_system_roles.network : Re-test connectivity 41684 1727204475.28346: ^ state is: HOST STATE: block=3, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False 41684 1727204475.28357: getting variables 41684 1727204475.28358: in VariableManager get_vars() 41684 1727204475.28394: Calling all_inventory to load vars for managed-node1 41684 1727204475.28397: Calling groups_inventory to load vars for managed-node1 41684 1727204475.28398: Calling all_plugins_inventory to load vars for managed-node1 41684 1727204475.28404: Calling all_plugins_play to load vars for managed-node1 41684 1727204475.28406: Calling groups_plugins_inventory to load vars for managed-node1 41684 1727204475.28408: Calling groups_plugins_play to load vars for managed-node1 41684 1727204475.29785: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41684 1727204475.32346: done with get_vars() 41684 1727204475.32369: done getting variables TASK [fedora.linux_system_roles.network : Re-test connectivity] **************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192 Tuesday 24 September 2024 15:01:15 -0400 (0:00:00.055) 0:00:31.725 ***** 41684 1727204475.32440: entering _queue_task() for managed-node1/ping 41684 1727204475.32740: worker is 1 (out of 1 available) 41684 1727204475.32753: exiting _queue_task() for managed-node1/ping 41684 1727204475.32767: done queuing things up, now waiting for results queue to drain 41684 1727204475.32768: waiting for pending results... 
41684 1727204475.33074: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Re-test connectivity 41684 1727204475.33214: in run() - task 0affcd87-79f5-3839-086d-000000000081 41684 1727204475.33235: variable 'ansible_search_path' from source: unknown 41684 1727204475.33243: variable 'ansible_search_path' from source: unknown 41684 1727204475.33286: calling self._execute() 41684 1727204475.33386: variable 'ansible_host' from source: host vars for 'managed-node1' 41684 1727204475.33397: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41684 1727204475.33412: variable 'omit' from source: magic vars 41684 1727204475.33786: variable 'ansible_distribution_major_version' from source: facts 41684 1727204475.33803: Evaluated conditional (ansible_distribution_major_version != '6'): True 41684 1727204475.33812: variable 'omit' from source: magic vars 41684 1727204475.33871: variable 'omit' from source: magic vars 41684 1727204475.33908: variable 'omit' from source: magic vars 41684 1727204475.33952: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 41684 1727204475.33996: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 41684 1727204475.34021: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 41684 1727204475.34041: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41684 1727204475.34057: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41684 1727204475.34096: variable 'inventory_hostname' from source: host vars for 'managed-node1' 41684 1727204475.34104: variable 'ansible_host' from source: host vars for 'managed-node1' 41684 1727204475.34111: variable 'ansible_ssh_extra_args' from source: host vars for 
'managed-node1' 41684 1727204475.34216: Set connection var ansible_connection to ssh 41684 1727204475.34227: Set connection var ansible_pipelining to False 41684 1727204475.34239: Set connection var ansible_module_compression to ZIP_DEFLATED 41684 1727204475.34251: Set connection var ansible_timeout to 10 41684 1727204475.34263: Set connection var ansible_shell_executable to /bin/sh 41684 1727204475.34273: Set connection var ansible_shell_type to sh 41684 1727204475.34313: variable 'ansible_shell_executable' from source: unknown 41684 1727204475.34321: variable 'ansible_connection' from source: unknown 41684 1727204475.34328: variable 'ansible_module_compression' from source: unknown 41684 1727204475.34334: variable 'ansible_shell_type' from source: unknown 41684 1727204475.34340: variable 'ansible_shell_executable' from source: unknown 41684 1727204475.34345: variable 'ansible_host' from source: host vars for 'managed-node1' 41684 1727204475.34351: variable 'ansible_pipelining' from source: unknown 41684 1727204475.34356: variable 'ansible_timeout' from source: unknown 41684 1727204475.34361: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41684 1727204475.34561: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 41684 1727204475.34579: variable 'omit' from source: magic vars 41684 1727204475.34588: starting attempt loop 41684 1727204475.34594: running the handler 41684 1727204475.34610: _low_level_execute_command(): starting 41684 1727204475.34625: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 41684 1727204475.35374: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 41684 1727204475.35392: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41684 
1727204475.35406: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41684 1727204475.35422: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41684 1727204475.35482: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 41684 1727204475.35498: stderr chunk (state=3): >>>debug2: match not found <<< 41684 1727204475.35511: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204475.35528: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 41684 1727204475.35538: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 41684 1727204475.35547: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 41684 1727204475.35562: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41684 1727204475.35579: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41684 1727204475.35598: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41684 1727204475.35611: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 41684 1727204475.35622: stderr chunk (state=3): >>>debug2: match found <<< 41684 1727204475.35635: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204475.35713: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 41684 1727204475.35729: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41684 1727204475.35743: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41684 1727204475.35893: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41684 
1727204475.37613: stdout chunk (state=3): >>>/root <<< 41684 1727204475.37813: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41684 1727204475.37816: stdout chunk (state=3): >>><<< 41684 1727204475.37818: stderr chunk (state=3): >>><<< 41684 1727204475.37926: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41684 1727204475.37930: _low_level_execute_command(): starting 41684 1727204475.37933: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204475.3783731-44154-14293003272237 `" && echo ansible-tmp-1727204475.3783731-44154-14293003272237="` echo /root/.ansible/tmp/ansible-tmp-1727204475.3783731-44154-14293003272237 `" ) && sleep 0' 41684 1727204475.38938: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 
3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 41684 1727204475.38942: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41684 1727204475.38989: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found <<< 41684 1727204475.38992: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 41684 1727204475.38995: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 41684 1727204475.38997: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204475.39053: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 41684 1727204475.39175: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41684 1727204475.39279: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41684 1727204475.39448: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41684 1727204475.41288: stdout chunk (state=3): >>>ansible-tmp-1727204475.3783731-44154-14293003272237=/root/.ansible/tmp/ansible-tmp-1727204475.3783731-44154-14293003272237 <<< 41684 1727204475.41409: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41684 1727204475.41497: stderr chunk (state=3): >>><<< 41684 1727204475.41500: stdout chunk (state=3): >>><<< 41684 1727204475.41569: 
_low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204475.3783731-44154-14293003272237=/root/.ansible/tmp/ansible-tmp-1727204475.3783731-44154-14293003272237 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41684 1727204475.41803: variable 'ansible_module_compression' from source: unknown 41684 1727204475.41807: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-41684fyviudxd/ansiballz_cache/ansible.modules.ping-ZIP_DEFLATED 41684 1727204475.41809: variable 'ansible_facts' from source: unknown 41684 1727204475.41811: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204475.3783731-44154-14293003272237/AnsiballZ_ping.py 41684 1727204475.41917: Sending initial data 41684 1727204475.41920: Sent initial data (152 bytes) 41684 1727204475.42848: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 41684 1727204475.42863: stderr chunk 
(state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41684 1727204475.42880: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41684 1727204475.42897: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41684 1727204475.42937: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 41684 1727204475.42950: stderr chunk (state=3): >>>debug2: match not found <<< 41684 1727204475.42967: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204475.42987: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 41684 1727204475.43000: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 41684 1727204475.43011: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 41684 1727204475.43023: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41684 1727204475.43038: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41684 1727204475.43056: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41684 1727204475.43074: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 41684 1727204475.43086: stderr chunk (state=3): >>>debug2: match found <<< 41684 1727204475.43099: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204475.43175: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 41684 1727204475.43192: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41684 1727204475.43207: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41684 1727204475.43947: stderr chunk (state=3): 
>>>debug1: mux_client_request_session: master session id: 2 <<< 41684 1727204475.45643: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 41684 1727204475.45694: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 41684 1727204475.45748: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-41684fyviudxd/tmpnky4hbf7 /root/.ansible/tmp/ansible-tmp-1727204475.3783731-44154-14293003272237/AnsiballZ_ping.py <<< 41684 1727204475.45801: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 41684 1727204475.47082: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41684 1727204475.47271: stderr chunk (state=3): >>><<< 41684 1727204475.47275: stdout chunk (state=3): >>><<< 41684 1727204475.47277: done transferring module to remote 41684 1727204475.47279: _low_level_execute_command(): starting 41684 1727204475.47281: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204475.3783731-44154-14293003272237/ /root/.ansible/tmp/ansible-tmp-1727204475.3783731-44154-14293003272237/AnsiballZ_ping.py && sleep 0' 41684 1727204475.48737: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 41684 1727204475.48750: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41684 
1727204475.49382: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41684 1727204475.49407: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41684 1727204475.49451: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 41684 1727204475.49470: stderr chunk (state=3): >>>debug2: match not found <<< 41684 1727204475.49483: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204475.49500: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 41684 1727204475.49511: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 41684 1727204475.49521: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 41684 1727204475.49531: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41684 1727204475.49543: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41684 1727204475.49557: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41684 1727204475.49574: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 41684 1727204475.49586: stderr chunk (state=3): >>>debug2: match found <<< 41684 1727204475.49599: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204475.49679: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 41684 1727204475.49698: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41684 1727204475.49712: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41684 1727204475.49796: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41684 
1727204475.51594: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41684 1727204475.51598: stdout chunk (state=3): >>><<< 41684 1727204475.51601: stderr chunk (state=3): >>><<< 41684 1727204475.51701: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41684 1727204475.51705: _low_level_execute_command(): starting 41684 1727204475.51707: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204475.3783731-44154-14293003272237/AnsiballZ_ping.py && sleep 0' 41684 1727204475.53233: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 41684 1727204475.53251: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41684 1727204475.53267: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41684 
1727204475.53285: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41684 1727204475.53329: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 41684 1727204475.53344: stderr chunk (state=3): >>>debug2: match not found <<< 41684 1727204475.53361: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204475.53382: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 41684 1727204475.53436: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 41684 1727204475.53454: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 41684 1727204475.53471: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41684 1727204475.53486: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41684 1727204475.53500: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41684 1727204475.53511: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 41684 1727204475.53521: stderr chunk (state=3): >>>debug2: match found <<< 41684 1727204475.53533: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204475.53616: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 41684 1727204475.53635: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41684 1727204475.53652: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41684 1727204475.53792: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41684 1727204475.66544: stdout chunk (state=3): >>> {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} <<< 41684 
1727204475.67520: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.148 closed. <<< 41684 1727204475.67524: stdout chunk (state=3): >>><<< 41684 1727204475.67526: stderr chunk (state=3): >>><<< 41684 1727204475.67659: _low_level_execute_command() done: rc=0, stdout= {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.148 closed. 
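The round-trip above ends with the transferred `AnsiballZ_ping.py` printing `{"ping": "pong", "invocation": {...}}` on stdout. A minimal sketch of what that payload computes (illustrative; the real `ansible.modules.ping` wraps this in `AnsibleModule` argument parsing and JSON exit handling):

```python
def ping(data: str = "pong") -> dict:
    # The ping module echoes its `data` argument back; passing
    # data="crash" makes the real module raise deliberately, which
    # is how connectivity *and* module execution get exercised.
    if data == "crash":
        raise Exception("boom")
    return {"ping": data, "invocation": {"module_args": {"data": data}}}

print(ping())
```

The controller parses that stdout JSON into the `ok: [managed-node1] => {"changed": false, "ping": "pong"}` result shown a few records later, then removes the remote temp directory.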
41684 1727204475.67667: done with _execute_module (ping, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ping', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204475.3783731-44154-14293003272237/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 41684 1727204475.67677: _low_level_execute_command(): starting 41684 1727204475.67679: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204475.3783731-44154-14293003272237/ > /dev/null 2>&1 && sleep 0' 41684 1727204475.69096: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 41684 1727204475.69100: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41684 1727204475.69136: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found <<< 41684 1727204475.69140: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 41684 1727204475.69142: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match 
for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found <<< 41684 1727204475.69145: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204475.69324: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 41684 1727204475.69388: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41684 1727204475.69602: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41684 1727204475.71379: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41684 1727204475.71383: stdout chunk (state=3): >>><<< 41684 1727204475.71389: stderr chunk (state=3): >>><<< 41684 1727204475.71408: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41684 
1727204475.71415: handler run complete 41684 1727204475.71431: attempt loop complete, returning result 41684 1727204475.71433: _execute() done 41684 1727204475.71436: dumping result to json 41684 1727204475.71438: done dumping result, returning 41684 1727204475.71448: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Re-test connectivity [0affcd87-79f5-3839-086d-000000000081] 41684 1727204475.71453: sending task result for task 0affcd87-79f5-3839-086d-000000000081 41684 1727204475.71549: done sending task result for task 0affcd87-79f5-3839-086d-000000000081 41684 1727204475.71552: WORKER PROCESS EXITING ok: [managed-node1] => { "changed": false, "ping": "pong" } 41684 1727204475.71614: no more pending results, returning what we have 41684 1727204475.71617: results queue empty 41684 1727204475.71618: checking for any_errors_fatal 41684 1727204475.71624: done checking for any_errors_fatal 41684 1727204475.71625: checking for max_fail_percentage 41684 1727204475.71626: done checking for max_fail_percentage 41684 1727204475.71627: checking to see if all hosts have failed and the running result is not ok 41684 1727204475.71628: done checking to see if all hosts have failed 41684 1727204475.71629: getting the remaining hosts for this loop 41684 1727204475.71631: done getting the remaining hosts for this loop 41684 1727204475.71634: getting the next task for host managed-node1 41684 1727204475.71644: done getting next task for host managed-node1 41684 1727204475.71646: ^ task is: TASK: meta (role_complete) 41684 1727204475.71649: ^ state is: HOST STATE: block=3, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 41684 1727204475.71659: getting variables 41684 1727204475.71661: in VariableManager get_vars() 41684 1727204475.71707: Calling all_inventory to load vars for managed-node1 41684 1727204475.71710: Calling groups_inventory to load vars for managed-node1 41684 1727204475.71713: Calling all_plugins_inventory to load vars for managed-node1 41684 1727204475.71723: Calling all_plugins_play to load vars for managed-node1 41684 1727204475.71725: Calling groups_plugins_inventory to load vars for managed-node1 41684 1727204475.71727: Calling groups_plugins_play to load vars for managed-node1 41684 1727204475.75735: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41684 1727204475.79133: done with get_vars() 41684 1727204475.79165: done getting variables 41684 1727204475.79251: done queuing things up, now waiting for results queue to drain 41684 1727204475.79253: results queue empty 41684 1727204475.79254: checking for any_errors_fatal 41684 1727204475.79257: done checking for any_errors_fatal 41684 1727204475.79258: checking for max_fail_percentage 41684 1727204475.79259: done checking for max_fail_percentage 41684 1727204475.79260: checking to see if all hosts have failed and the running result is not ok 41684 1727204475.79260: done checking to see if all hosts have failed 41684 1727204475.79261: getting the remaining hosts for this loop 41684 1727204475.79262: done getting the remaining hosts for this loop 41684 1727204475.79266: getting the next task for host managed-node1 41684 1727204475.79271: done getting next task for host managed-node1 41684 1727204475.79273: ^ task is: TASK: Assert that the warning about specifying the route without the output device is logged for initscripts provider 41684 1727204475.79274: ^ state is: HOST 
STATE: block=3, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 41684 1727204475.79277: getting variables 41684 1727204475.79278: in VariableManager get_vars() 41684 1727204475.79293: Calling all_inventory to load vars for managed-node1 41684 1727204475.79296: Calling groups_inventory to load vars for managed-node1 41684 1727204475.79298: Calling all_plugins_inventory to load vars for managed-node1 41684 1727204475.79303: Calling all_plugins_play to load vars for managed-node1 41684 1727204475.79306: Calling groups_plugins_inventory to load vars for managed-node1 41684 1727204475.79308: Calling groups_plugins_play to load vars for managed-node1 41684 1727204475.80580: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41684 1727204475.83982: done with get_vars() 41684 1727204475.84009: done getting variables 41684 1727204475.84280: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Assert that the warning about specifying the route without the output device is logged for initscripts provider] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_route_device.yml:122 Tuesday 24 September 2024 15:01:15 -0400 (0:00:00.518) 0:00:32.244 ***** 41684 1727204475.84341: entering _queue_task() for managed-node1/assert 41684 1727204475.84694: worker is 1 (out of 1 available) 41684 1727204475.84705: exiting _queue_task() for managed-node1/assert 41684 1727204475.84717: done queuing things up, now 
waiting for results queue to drain
41684 1727204475.84718: waiting for pending results...
41684 1727204475.85034: running TaskExecutor() for managed-node1/TASK: Assert that the warning about specifying the route without the output device is logged for initscripts provider
41684 1727204475.85151: in run() - task 0affcd87-79f5-3839-086d-0000000000b1
41684 1727204475.85180: variable 'ansible_search_path' from source: unknown
41684 1727204475.85223: calling self._execute()
41684 1727204475.85331: variable 'ansible_host' from source: host vars for 'managed-node1'
41684 1727204475.85344: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1'
41684 1727204475.85361: variable 'omit' from source: magic vars
41684 1727204475.85756: variable 'ansible_distribution_major_version' from source: facts
41684 1727204475.85778: Evaluated conditional (ansible_distribution_major_version != '6'): True
41684 1727204475.85903: variable 'network_provider' from source: set_fact
41684 1727204475.85915: Evaluated conditional (network_provider == "initscripts"): False
41684 1727204475.85927: when evaluation is False, skipping this task
41684 1727204475.85935: _execute() done
41684 1727204475.85944: dumping result to json
41684 1727204475.85952: done dumping result, returning
41684 1727204475.85962: done running TaskExecutor() for managed-node1/TASK: Assert that the warning about specifying the route without the output device is logged for initscripts provider [0affcd87-79f5-3839-086d-0000000000b1]
41684 1727204475.85977: sending task result for task 0affcd87-79f5-3839-086d-0000000000b1
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "network_provider == \"initscripts\"",
    "skip_reason": "Conditional result was False"
}
41684 1727204475.86299: no more pending results, returning what we have
41684 1727204475.86303: results queue empty
41684 1727204475.86304: checking for any_errors_fatal
41684 1727204475.86307: done checking for any_errors_fatal
41684
1727204475.86307: checking for max_fail_percentage 41684 1727204475.86309: done checking for max_fail_percentage 41684 1727204475.86310: checking to see if all hosts have failed and the running result is not ok 41684 1727204475.86311: done checking to see if all hosts have failed 41684 1727204475.86311: getting the remaining hosts for this loop 41684 1727204475.86314: done getting the remaining hosts for this loop 41684 1727204475.86318: getting the next task for host managed-node1 41684 1727204475.86325: done getting next task for host managed-node1 41684 1727204475.86328: ^ task is: TASK: Assert that no warning is logged for nm provider 41684 1727204475.86331: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 41684 1727204475.86334: getting variables 41684 1727204475.86336: in VariableManager get_vars() 41684 1727204475.86385: Calling all_inventory to load vars for managed-node1 41684 1727204475.86389: Calling groups_inventory to load vars for managed-node1 41684 1727204475.86391: Calling all_plugins_inventory to load vars for managed-node1 41684 1727204475.86406: Calling all_plugins_play to load vars for managed-node1 41684 1727204475.86409: Calling groups_plugins_inventory to load vars for managed-node1 41684 1727204475.86412: Calling groups_plugins_play to load vars for managed-node1 41684 1727204475.87786: done sending task result for task 0affcd87-79f5-3839-086d-0000000000b1 41684 1727204475.87790: WORKER PROCESS EXITING 41684 1727204475.88178: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41684 1727204475.89951: done with get_vars() 41684 1727204475.89978: done getting variables 41684 1727204475.90024: Loading ActionModule 'assert' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Assert that no warning is logged for nm provider] ************************ task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_route_device.yml:129 Tuesday 24 September 2024 15:01:15 -0400 (0:00:00.057) 0:00:32.302 ***** 41684 1727204475.90048: entering _queue_task() for managed-node1/assert 41684 1727204475.90296: worker is 1 (out of 1 available) 41684 1727204475.90311: exiting _queue_task() for managed-node1/assert 41684 1727204475.90324: done queuing things up, now waiting for results queue to drain 41684 1727204475.90325: waiting for pending results... 41684 1727204475.90527: running TaskExecutor() for managed-node1/TASK: Assert that no warning is logged for nm provider 41684 1727204475.90590: in run() - task 0affcd87-79f5-3839-086d-0000000000b2 41684 1727204475.90603: variable 'ansible_search_path' from source: unknown 41684 1727204475.90634: calling self._execute() 41684 1727204475.90714: variable 'ansible_host' from source: host vars for 'managed-node1' 41684 1727204475.90718: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41684 1727204475.90726: variable 'omit' from source: magic vars 41684 1727204475.91020: variable 'ansible_distribution_major_version' from source: facts 41684 1727204475.91053: Evaluated conditional (ansible_distribution_major_version != '6'): True 41684 1727204475.91374: variable 'network_provider' from source: set_fact 41684 1727204475.91377: Evaluated conditional (network_provider == "nm"): True 41684 1727204475.91380: variable 'omit' from source: magic vars 41684 1727204475.91382: variable 'omit' from source: magic vars 41684 1727204475.91384: variable 'omit' from source: magic vars 
41684 1727204475.91386: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 41684 1727204475.91389: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 41684 1727204475.91391: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 41684 1727204475.91393: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41684 1727204475.91426: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41684 1727204475.91435: variable 'inventory_hostname' from source: host vars for 'managed-node1' 41684 1727204475.91439: variable 'ansible_host' from source: host vars for 'managed-node1' 41684 1727204475.91442: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41684 1727204475.91548: Set connection var ansible_connection to ssh 41684 1727204475.91552: Set connection var ansible_pipelining to False 41684 1727204475.91555: Set connection var ansible_module_compression to ZIP_DEFLATED 41684 1727204475.91573: Set connection var ansible_timeout to 10 41684 1727204475.91577: Set connection var ansible_shell_executable to /bin/sh 41684 1727204475.91579: Set connection var ansible_shell_type to sh 41684 1727204475.91606: variable 'ansible_shell_executable' from source: unknown 41684 1727204475.91609: variable 'ansible_connection' from source: unknown 41684 1727204475.91612: variable 'ansible_module_compression' from source: unknown 41684 1727204475.91614: variable 'ansible_shell_type' from source: unknown 41684 1727204475.91616: variable 'ansible_shell_executable' from source: unknown 41684 1727204475.91618: variable 'ansible_host' from source: host vars for 'managed-node1' 41684 1727204475.91621: variable 'ansible_pipelining' from source: unknown 41684 1727204475.91623: 
variable 'ansible_timeout' from source: unknown 41684 1727204475.91628: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41684 1727204475.91773: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 41684 1727204475.91786: variable 'omit' from source: magic vars 41684 1727204475.91791: starting attempt loop 41684 1727204475.91794: running the handler 41684 1727204475.91958: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 41684 1727204475.92196: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 41684 1727204475.92238: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 41684 1727204475.92357: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 41684 1727204475.92422: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 41684 1727204475.92558: variable '__network_connections_result' from source: set_fact 41684 1727204475.92599: Evaluated conditional (__network_connections_result.stderr is not search("")): True 41684 1727204475.92611: handler run complete 41684 1727204475.92631: attempt loop complete, returning result 41684 1727204475.92634: _execute() done 41684 1727204475.92637: dumping result to json 41684 1727204475.92639: done dumping result, returning 41684 1727204475.92649: done running TaskExecutor() for managed-node1/TASK: Assert that no warning is logged for nm provider [0affcd87-79f5-3839-086d-0000000000b2] 41684 1727204475.92651: sending task result for task 0affcd87-79f5-3839-086d-0000000000b2 41684 1727204475.92777: done sending 
task result for task 0affcd87-79f5-3839-086d-0000000000b2
41684 1727204475.92780: WORKER PROCESS EXITING
ok: [managed-node1] => {
    "changed": false
}

MSG:

All assertions passed
41684 1727204475.92853: no more pending results, returning what we have
41684 1727204475.92858: results queue empty
41684 1727204475.92859: checking for any_errors_fatal
41684 1727204475.92873: done checking for any_errors_fatal
41684 1727204475.92875: checking for max_fail_percentage
41684 1727204475.92876: done checking for max_fail_percentage
41684 1727204475.92877: checking to see if all hosts have failed and the running result is not ok
41684 1727204475.92878: done checking to see if all hosts have failed
41684 1727204475.92879: getting the remaining hosts for this loop
41684 1727204475.92881: done getting the remaining hosts for this loop
41684 1727204475.92885: getting the next task for host managed-node1
41684 1727204475.92894: done getting next task for host managed-node1
41684 1727204475.92898: ^ task is: TASK: Bring down test devices and profiles
41684 1727204475.92902: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task?
False 41684 1727204475.92906: getting variables 41684 1727204475.92908: in VariableManager get_vars() 41684 1727204475.92948: Calling all_inventory to load vars for managed-node1 41684 1727204475.92950: Calling groups_inventory to load vars for managed-node1 41684 1727204475.92952: Calling all_plugins_inventory to load vars for managed-node1 41684 1727204475.92966: Calling all_plugins_play to load vars for managed-node1 41684 1727204475.92969: Calling groups_plugins_inventory to load vars for managed-node1 41684 1727204475.92972: Calling groups_plugins_play to load vars for managed-node1 41684 1727204475.99079: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41684 1727204476.01206: done with get_vars() 41684 1727204476.01237: done getting variables TASK [Bring down test devices and profiles] ************************************ task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_route_device.yml:140 Tuesday 24 September 2024 15:01:16 -0400 (0:00:00.112) 0:00:32.414 ***** 41684 1727204476.01344: entering _queue_task() for managed-node1/include_role 41684 1727204476.01346: Creating lock for include_role 41684 1727204476.02314: worker is 1 (out of 1 available) 41684 1727204476.02326: exiting _queue_task() for managed-node1/include_role 41684 1727204476.02338: done queuing things up, now waiting for results queue to drain 41684 1727204476.02340: waiting for pending results... 
41684 1727204476.02898: running TaskExecutor() for managed-node1/TASK: Bring down test devices and profiles 41684 1727204476.03019: in run() - task 0affcd87-79f5-3839-086d-0000000000b4 41684 1727204476.03034: variable 'ansible_search_path' from source: unknown 41684 1727204476.03141: calling self._execute() 41684 1727204476.03295: variable 'ansible_host' from source: host vars for 'managed-node1' 41684 1727204476.03316: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41684 1727204476.03333: variable 'omit' from source: magic vars 41684 1727204476.03876: variable 'ansible_distribution_major_version' from source: facts 41684 1727204476.03901: Evaluated conditional (ansible_distribution_major_version != '6'): True 41684 1727204476.03913: _execute() done 41684 1727204476.03922: dumping result to json 41684 1727204476.03930: done dumping result, returning 41684 1727204476.03940: done running TaskExecutor() for managed-node1/TASK: Bring down test devices and profiles [0affcd87-79f5-3839-086d-0000000000b4] 41684 1727204476.03970: sending task result for task 0affcd87-79f5-3839-086d-0000000000b4 41684 1727204476.04148: no more pending results, returning what we have 41684 1727204476.04154: in VariableManager get_vars() 41684 1727204476.04208: Calling all_inventory to load vars for managed-node1 41684 1727204476.04212: Calling groups_inventory to load vars for managed-node1 41684 1727204476.04214: Calling all_plugins_inventory to load vars for managed-node1 41684 1727204476.04229: Calling all_plugins_play to load vars for managed-node1 41684 1727204476.04233: Calling groups_plugins_inventory to load vars for managed-node1 41684 1727204476.04236: Calling groups_plugins_play to load vars for managed-node1 41684 1727204476.05671: done sending task result for task 0affcd87-79f5-3839-086d-0000000000b4 41684 1727204476.05675: WORKER PROCESS EXITING 41684 1727204476.07605: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' 
skipped due to reserved name 41684 1727204476.09593: done with get_vars() 41684 1727204476.09617: variable 'ansible_search_path' from source: unknown 41684 1727204476.09881: variable 'omit' from source: magic vars 41684 1727204476.09914: variable 'omit' from source: magic vars 41684 1727204476.09929: variable 'omit' from source: magic vars 41684 1727204476.09933: we have included files to process 41684 1727204476.09933: generating all_blocks data 41684 1727204476.09936: done generating all_blocks data 41684 1727204476.09941: processing included file: fedora.linux_system_roles.network 41684 1727204476.09961: in VariableManager get_vars() 41684 1727204476.09984: done with get_vars() 41684 1727204476.10012: in VariableManager get_vars() 41684 1727204476.10031: done with get_vars() 41684 1727204476.10077: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/defaults/main.yml 41684 1727204476.10201: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/meta/main.yml 41684 1727204476.10287: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml 41684 1727204476.10741: in VariableManager get_vars() 41684 1727204476.10768: done with get_vars() redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 41684 1727204476.13645: iterating over new_blocks loaded from include file 41684 1727204476.13648: in VariableManager get_vars() 41684 1727204476.13672: done with get_vars() 41684 1727204476.13674: filtering new block on tags 41684 1727204476.13978: done filtering new block on tags 41684 1727204476.13982: in VariableManager get_vars() 41684 1727204476.14000: done with get_vars() 41684 1727204476.14002: filtering new block on tags 41684 1727204476.14018: done filtering new block on tags 41684 1727204476.14020: done iterating over new_blocks loaded from include file included: fedora.linux_system_roles.network for 
managed-node1 41684 1727204476.14026: extending task lists for all hosts with included blocks 41684 1727204476.14262: done extending task lists 41684 1727204476.14265: done processing included files 41684 1727204476.14267: results queue empty 41684 1727204476.14267: checking for any_errors_fatal 41684 1727204476.14271: done checking for any_errors_fatal 41684 1727204476.14272: checking for max_fail_percentage 41684 1727204476.14273: done checking for max_fail_percentage 41684 1727204476.14274: checking to see if all hosts have failed and the running result is not ok 41684 1727204476.14275: done checking to see if all hosts have failed 41684 1727204476.14276: getting the remaining hosts for this loop 41684 1727204476.14277: done getting the remaining hosts for this loop 41684 1727204476.14279: getting the next task for host managed-node1 41684 1727204476.14284: done getting next task for host managed-node1 41684 1727204476.14287: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 41684 1727204476.14290: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 41684 1727204476.14300: getting variables 41684 1727204476.14301: in VariableManager get_vars() 41684 1727204476.14316: Calling all_inventory to load vars for managed-node1 41684 1727204476.14319: Calling groups_inventory to load vars for managed-node1 41684 1727204476.14321: Calling all_plugins_inventory to load vars for managed-node1 41684 1727204476.14326: Calling all_plugins_play to load vars for managed-node1 41684 1727204476.14328: Calling groups_plugins_inventory to load vars for managed-node1 41684 1727204476.14331: Calling groups_plugins_play to load vars for managed-node1 41684 1727204476.16122: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41684 1727204476.18080: done with get_vars() 41684 1727204476.18114: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4 Tuesday 24 September 2024 15:01:16 -0400 (0:00:00.168) 0:00:32.583 ***** 41684 1727204476.18198: entering _queue_task() for managed-node1/include_tasks 41684 1727204476.18544: worker is 1 (out of 1 available) 41684 1727204476.18559: exiting _queue_task() for managed-node1/include_tasks 41684 1727204476.18574: done queuing things up, now waiting for results queue to drain 41684 1727204476.18575: waiting for pending results... 
41684 1727204476.18881: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 41684 1727204476.19001: in run() - task 0affcd87-79f5-3839-086d-000000000641 41684 1727204476.19016: variable 'ansible_search_path' from source: unknown 41684 1727204476.19020: variable 'ansible_search_path' from source: unknown 41684 1727204476.19065: calling self._execute() 41684 1727204476.19168: variable 'ansible_host' from source: host vars for 'managed-node1' 41684 1727204476.19174: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41684 1727204476.19177: variable 'omit' from source: magic vars 41684 1727204476.19567: variable 'ansible_distribution_major_version' from source: facts 41684 1727204476.19584: Evaluated conditional (ansible_distribution_major_version != '6'): True 41684 1727204476.19590: _execute() done 41684 1727204476.19593: dumping result to json 41684 1727204476.19595: done dumping result, returning 41684 1727204476.19603: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role [0affcd87-79f5-3839-086d-000000000641] 41684 1727204476.19609: sending task result for task 0affcd87-79f5-3839-086d-000000000641 41684 1727204476.19740: done sending task result for task 0affcd87-79f5-3839-086d-000000000641 41684 1727204476.19765: no more pending results, returning what we have 41684 1727204476.19772: in VariableManager get_vars() 41684 1727204476.19828: Calling all_inventory to load vars for managed-node1 41684 1727204476.19831: Calling groups_inventory to load vars for managed-node1 41684 1727204476.19834: Calling all_plugins_inventory to load vars for managed-node1 41684 1727204476.19841: WORKER PROCESS EXITING 41684 1727204476.19856: Calling all_plugins_play to load vars for managed-node1 41684 1727204476.19860: Calling groups_plugins_inventory to load vars for managed-node1 41684 1727204476.19865: Calling 
groups_plugins_play to load vars for managed-node1 41684 1727204476.21686: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41684 1727204476.23400: done with get_vars() 41684 1727204476.23423: variable 'ansible_search_path' from source: unknown 41684 1727204476.23424: variable 'ansible_search_path' from source: unknown 41684 1727204476.23469: we have included files to process 41684 1727204476.23471: generating all_blocks data 41684 1727204476.23472: done generating all_blocks data 41684 1727204476.23475: processing included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 41684 1727204476.23476: loading included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 41684 1727204476.23478: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 41684 1727204476.24043: done processing included file 41684 1727204476.24045: iterating over new_blocks loaded from include file 41684 1727204476.24047: in VariableManager get_vars() 41684 1727204476.24076: done with get_vars() 41684 1727204476.24079: filtering new block on tags 41684 1727204476.24107: done filtering new block on tags 41684 1727204476.24110: in VariableManager get_vars() 41684 1727204476.24133: done with get_vars() 41684 1727204476.24134: filtering new block on tags 41684 1727204476.24175: done filtering new block on tags 41684 1727204476.24177: in VariableManager get_vars() 41684 1727204476.24205: done with get_vars() 41684 1727204476.24207: filtering new block on tags 41684 1727204476.24244: done filtering new block on tags 41684 1727204476.24246: done iterating over new_blocks loaded from include file included: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml for managed-node1 41684 1727204476.24252: extending task lists for 
all hosts with included blocks 41684 1727204476.25306: done extending task lists 41684 1727204476.25307: done processing included files 41684 1727204476.25308: results queue empty 41684 1727204476.25309: checking for any_errors_fatal 41684 1727204476.25312: done checking for any_errors_fatal 41684 1727204476.25313: checking for max_fail_percentage 41684 1727204476.25314: done checking for max_fail_percentage 41684 1727204476.25315: checking to see if all hosts have failed and the running result is not ok 41684 1727204476.25316: done checking to see if all hosts have failed 41684 1727204476.25317: getting the remaining hosts for this loop 41684 1727204476.25318: done getting the remaining hosts for this loop 41684 1727204476.25321: getting the next task for host managed-node1 41684 1727204476.25325: done getting next task for host managed-node1 41684 1727204476.25328: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 41684 1727204476.25332: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 41684 1727204476.25342: getting variables 41684 1727204476.25344: in VariableManager get_vars() 41684 1727204476.25360: Calling all_inventory to load vars for managed-node1 41684 1727204476.25363: Calling groups_inventory to load vars for managed-node1 41684 1727204476.25366: Calling all_plugins_inventory to load vars for managed-node1 41684 1727204476.25372: Calling all_plugins_play to load vars for managed-node1 41684 1727204476.25374: Calling groups_plugins_inventory to load vars for managed-node1 41684 1727204476.25377: Calling groups_plugins_play to load vars for managed-node1 41684 1727204476.26590: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41684 1727204476.28243: done with get_vars() 41684 1727204476.28273: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role are present] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:3 Tuesday 24 September 2024 15:01:16 -0400 (0:00:00.101) 0:00:32.685 ***** 41684 1727204476.28357: entering _queue_task() for managed-node1/setup 41684 1727204476.28698: worker is 1 (out of 1 available) 41684 1727204476.28710: exiting _queue_task() for managed-node1/setup 41684 1727204476.28723: done queuing things up, now waiting for results queue to drain 41684 1727204476.28724: waiting for pending results... 
41684 1727204476.29016: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 41684 1727204476.29148: in run() - task 0affcd87-79f5-3839-086d-0000000006a7 41684 1727204476.29161: variable 'ansible_search_path' from source: unknown 41684 1727204476.29171: variable 'ansible_search_path' from source: unknown 41684 1727204476.29209: calling self._execute() 41684 1727204476.29310: variable 'ansible_host' from source: host vars for 'managed-node1' 41684 1727204476.29314: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41684 1727204476.29323: variable 'omit' from source: magic vars 41684 1727204476.29699: variable 'ansible_distribution_major_version' from source: facts 41684 1727204476.29711: Evaluated conditional (ansible_distribution_major_version != '6'): True 41684 1727204476.29935: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 41684 1727204476.32349: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 41684 1727204476.32413: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 41684 1727204476.32455: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 41684 1727204476.32492: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 41684 1727204476.32519: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 41684 1727204476.32600: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 41684 1727204476.32628: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' 
from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
41684 1727204476.32658: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
41684 1727204476.32703: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
41684 1727204476.32717: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
41684 1727204476.32777: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
41684 1727204476.32801: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
41684 1727204476.32827: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
41684 1727204476.32869: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
41684 1727204476.32892: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
41684 1727204476.33052: variable '__network_required_facts' from source: role '' defaults
41684 1727204476.33067: variable 'ansible_facts' from source: unknown
41684 1727204476.33828: Evaluated conditional (__network_required_facts | difference(ansible_facts.keys() | list) | length > 0): False
41684 1727204476.33832: when evaluation is False, skipping this task
41684 1727204476.33835: _execute() done
41684 1727204476.33837: dumping result to json
41684 1727204476.33839: done dumping result, returning
41684 1727204476.33847: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present [0affcd87-79f5-3839-086d-0000000006a7]
41684 1727204476.33859: sending task result for task 0affcd87-79f5-3839-086d-0000000006a7
41684 1727204476.33952: done sending task result for task 0affcd87-79f5-3839-086d-0000000006a7
41684 1727204476.33955: WORKER PROCESS EXITING
skipping: [managed-node1] => {
    "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result",
    "changed": false
}
41684 1727204476.34008: no more pending results, returning what we have
41684 1727204476.34012: results queue empty
41684 1727204476.34014: checking for any_errors_fatal
41684 1727204476.34016: done checking for any_errors_fatal
41684 1727204476.34017: checking for max_fail_percentage
41684 1727204476.34018: done checking for max_fail_percentage
41684 1727204476.34019: checking to see if all hosts have failed and the running result is not ok
41684 1727204476.34020: done checking to see if all hosts have failed
41684 1727204476.34021: getting the remaining hosts for this loop
41684 1727204476.34023: done getting the remaining hosts for this loop
41684 1727204476.34028: getting the next task for host managed-node1
41684 1727204476.34039: done getting next task for host managed-node1
41684 1727204476.34045: ^ task is: TASK: fedora.linux_system_roles.network : Check if system is ostree
41684 1727204476.34051: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False
41684 1727204476.34073: getting variables
41684 1727204476.34076: in VariableManager get_vars()
41684 1727204476.34127: Calling all_inventory to load vars for managed-node1
41684 1727204476.34131: Calling groups_inventory to load vars for managed-node1
41684 1727204476.34133: Calling all_plugins_inventory to load vars for managed-node1
41684 1727204476.34145: Calling all_plugins_play to load vars for managed-node1
41684 1727204476.34149: Calling groups_plugins_inventory to load vars for managed-node1
41684 1727204476.34152: Calling groups_plugins_play to load vars for managed-node1
41684 1727204476.35690: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
41684 1727204476.36640: done with get_vars()
41684 1727204476.36657: done getting variables

TASK [fedora.linux_system_roles.network : Check if system is ostree] ***********
task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:12
Tuesday 24 September 2024 15:01:16 -0400 (0:00:00.083) 0:00:32.768 *****
41684 1727204476.36737: entering _queue_task() for managed-node1/stat
41684 1727204476.36968: worker is 1 (out of 1 available)
41684 1727204476.36981: exiting _queue_task() for managed-node1/stat
41684 1727204476.36995: done queuing things up, now waiting for results queue to drain
41684 1727204476.36997: waiting for pending results...
41684 1727204476.37283: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Check if system is ostree
41684 1727204476.37430: in run() - task 0affcd87-79f5-3839-086d-0000000006a9
41684 1727204476.37434: variable 'ansible_search_path' from source: unknown
41684 1727204476.37437: variable 'ansible_search_path' from source: unknown
41684 1727204476.37440: calling self._execute()
41684 1727204476.37519: variable 'ansible_host' from source: host vars for 'managed-node1'
41684 1727204476.37523: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1'
41684 1727204476.37533: variable 'omit' from source: magic vars
41684 1727204476.37921: variable 'ansible_distribution_major_version' from source: facts
41684 1727204476.37933: Evaluated conditional (ansible_distribution_major_version != '6'): True
41684 1727204476.38104: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name
41684 1727204476.38377: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py
41684 1727204476.38427: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py
41684 1727204476.38456: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py
41684 1727204476.38490: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py
41684 1727204476.38591: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False)
41684 1727204476.38631: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False)
41684 1727204476.38670: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False)
41684 1727204476.38691: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False)
41684 1727204476.38766: variable '__network_is_ostree' from source: set_fact
41684 1727204476.38770: Evaluated conditional (not __network_is_ostree is defined): False
41684 1727204476.38773: when evaluation is False, skipping this task
41684 1727204476.38776: _execute() done
41684 1727204476.38783: dumping result to json
41684 1727204476.38786: done dumping result, returning
41684 1727204476.38792: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Check if system is ostree [0affcd87-79f5-3839-086d-0000000006a9]
41684 1727204476.38798: sending task result for task 0affcd87-79f5-3839-086d-0000000006a9
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "not __network_is_ostree is defined",
    "skip_reason": "Conditional result was False"
}
41684 1727204476.38936: no more pending results, returning what we have
41684 1727204476.38940: results queue empty
41684 1727204476.38941: checking for any_errors_fatal
41684 1727204476.38948: done checking for any_errors_fatal
41684 1727204476.38949: checking for max_fail_percentage
41684 1727204476.38950: done checking for max_fail_percentage
41684 1727204476.38951: checking to see if all hosts have failed and the running result is not ok
41684 1727204476.38951: done checking to see if all hosts have failed
41684 1727204476.38952: getting the remaining hosts for this loop
41684 1727204476.38954: done getting the remaining hosts for this loop
41684 1727204476.38958: getting the next task for host managed-node1
41684 1727204476.38969: done getting next task for host managed-node1
41684 1727204476.38973: ^ task is: TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree
41684 1727204476.38978: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False
41684 1727204476.38998: getting variables
41684 1727204476.39000: in VariableManager get_vars()
41684 1727204476.39045: Calling all_inventory to load vars for managed-node1
41684 1727204476.39048: Calling groups_inventory to load vars for managed-node1
41684 1727204476.39050: Calling all_plugins_inventory to load vars for managed-node1
41684 1727204476.39060: Calling all_plugins_play to load vars for managed-node1
41684 1727204476.39066: Calling groups_plugins_inventory to load vars for managed-node1
41684 1727204476.39070: Calling groups_plugins_play to load vars for managed-node1
41684 1727204476.39588: done sending task result for task 0affcd87-79f5-3839-086d-0000000006a9
41684 1727204476.39914: WORKER PROCESS EXITING
41684 1727204476.39925: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
41684 1727204476.41298: done with get_vars()
41684 1727204476.41320: done getting variables
41684 1727204476.41388: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Set flag to indicate system is ostree] ***
task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:17
Tuesday 24 September 2024 15:01:16 -0400 (0:00:00.046) 0:00:32.815 *****
41684 1727204476.41424: entering _queue_task() for managed-node1/set_fact
41684 1727204476.41756: worker is 1 (out of 1 available)
41684 1727204476.41773: exiting _queue_task() for managed-node1/set_fact
41684 1727204476.41788: done queuing things up, now waiting for results queue to drain
41684 1727204476.41790: waiting for pending results...
41684 1727204476.42041: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree
41684 1727204476.42196: in run() - task 0affcd87-79f5-3839-086d-0000000006aa
41684 1727204476.42215: variable 'ansible_search_path' from source: unknown
41684 1727204476.42223: variable 'ansible_search_path' from source: unknown
41684 1727204476.42260: calling self._execute()
41684 1727204476.42371: variable 'ansible_host' from source: host vars for 'managed-node1'
41684 1727204476.42386: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1'
41684 1727204476.42406: variable 'omit' from source: magic vars
41684 1727204476.42803: variable 'ansible_distribution_major_version' from source: facts
41684 1727204476.42836: Evaluated conditional (ansible_distribution_major_version != '6'): True
41684 1727204476.42997: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name
41684 1727204476.43191: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py
41684 1727204476.43223: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py
41684 1727204476.43247: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py
41684 1727204476.43280: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py
41684 1727204476.43345: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False)
41684 1727204476.43363: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False)
41684 1727204476.43388: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False)
41684 1727204476.43407: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False)
41684 1727204476.43473: variable '__network_is_ostree' from source: set_fact
41684 1727204476.43480: Evaluated conditional (not __network_is_ostree is defined): False
41684 1727204476.43484: when evaluation is False, skipping this task
41684 1727204476.43486: _execute() done
41684 1727204476.43489: dumping result to json
41684 1727204476.43491: done dumping result, returning
41684 1727204476.43500: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree [0affcd87-79f5-3839-086d-0000000006aa]
41684 1727204476.43505: sending task result for task 0affcd87-79f5-3839-086d-0000000006aa
41684 1727204476.43592: done sending task result for task 0affcd87-79f5-3839-086d-0000000006aa
41684 1727204476.43595: WORKER PROCESS EXITING
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "not __network_is_ostree is defined",
    "skip_reason": "Conditional result was False"
}
41684 1727204476.43643: no more pending results, returning what we have
41684 1727204476.43647: results queue empty
41684 1727204476.43648: checking for any_errors_fatal
41684 1727204476.43653: done checking for any_errors_fatal
41684 1727204476.43654: checking for max_fail_percentage
41684 1727204476.43655: done checking for max_fail_percentage
41684 1727204476.43656: checking to see if all hosts have failed and the running result is not ok
41684 1727204476.43657: done checking to see if all hosts have failed
41684 1727204476.43657: getting the remaining hosts for this loop
41684 1727204476.43659: done getting the remaining hosts for this loop
41684 1727204476.43665: getting the next task for host managed-node1
41684 1727204476.43674: done getting next task for host managed-node1
41684 1727204476.43678: ^ task is: TASK: fedora.linux_system_roles.network : Check which services are running
41684 1727204476.43683: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False
41684 1727204476.43702: getting variables
41684 1727204476.43704: in VariableManager get_vars()
41684 1727204476.43743: Calling all_inventory to load vars for managed-node1
41684 1727204476.43746: Calling groups_inventory to load vars for managed-node1
41684 1727204476.43747: Calling all_plugins_inventory to load vars for managed-node1
41684 1727204476.43756: Calling all_plugins_play to load vars for managed-node1
41684 1727204476.43758: Calling groups_plugins_inventory to load vars for managed-node1
41684 1727204476.43760: Calling groups_plugins_play to load vars for managed-node1
41684 1727204476.44573: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
41684 1727204476.46512: done with get_vars()
41684 1727204476.46539: done getting variables

TASK [fedora.linux_system_roles.network : Check which services are running] ****
task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21
Tuesday 24 September 2024 15:01:16 -0400 (0:00:00.052) 0:00:32.868 *****
41684 1727204476.46650: entering _queue_task() for managed-node1/service_facts
41684 1727204476.46979: worker is 1 (out of 1 available)
41684 1727204476.46998: exiting _queue_task() for managed-node1/service_facts
41684 1727204476.47011: done queuing things up, now waiting for results queue to drain
41684 1727204476.47012: waiting for pending results...
41684 1727204476.47293: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Check which services are running
41684 1727204476.47462: in run() - task 0affcd87-79f5-3839-086d-0000000006ac
41684 1727204476.47469: variable 'ansible_search_path' from source: unknown
41684 1727204476.47472: variable 'ansible_search_path' from source: unknown
41684 1727204476.47475: calling self._execute()
41684 1727204476.47530: variable 'ansible_host' from source: host vars for 'managed-node1'
41684 1727204476.47538: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1'
41684 1727204476.47548: variable 'omit' from source: magic vars
41684 1727204476.47945: variable 'ansible_distribution_major_version' from source: facts
41684 1727204476.47957: Evaluated conditional (ansible_distribution_major_version != '6'): True
41684 1727204476.47963: variable 'omit' from source: magic vars
41684 1727204476.48037: variable 'omit' from source: magic vars
41684 1727204476.48074: variable 'omit' from source: magic vars
41684 1727204476.48121: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection
41684 1727204476.48156: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False)
41684 1727204476.48182: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell
41684 1727204476.48208: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
41684 1727204476.48217: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
41684 1727204476.48248: variable 'inventory_hostname' from source: host vars for 'managed-node1'
41684 1727204476.48251: variable 'ansible_host' from source: host vars for 'managed-node1'
41684 1727204476.48253: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1'
41684 1727204476.48363: Set connection var ansible_connection to ssh
41684 1727204476.48371: Set connection var ansible_pipelining to False
41684 1727204476.48383: Set connection var ansible_module_compression to ZIP_DEFLATED
41684 1727204476.48385: Set connection var ansible_timeout to 10
41684 1727204476.48393: Set connection var ansible_shell_executable to /bin/sh
41684 1727204476.48396: Set connection var ansible_shell_type to sh
41684 1727204476.48421: variable 'ansible_shell_executable' from source: unknown
41684 1727204476.48424: variable 'ansible_connection' from source: unknown
41684 1727204476.48426: variable 'ansible_module_compression' from source: unknown
41684 1727204476.48429: variable 'ansible_shell_type' from source: unknown
41684 1727204476.48431: variable 'ansible_shell_executable' from source: unknown
41684 1727204476.48433: variable 'ansible_host' from source: host vars for 'managed-node1'
41684 1727204476.48438: variable 'ansible_pipelining' from source: unknown
41684 1727204476.48441: variable 'ansible_timeout' from source: unknown
41684 1727204476.48445: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1'
41684 1727204476.48661: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__)
41684 1727204476.48683: variable 'omit' from source: magic vars
41684 1727204476.48708: starting attempt loop
41684 1727204476.48713: running the handler
41684 1727204476.48716: _low_level_execute_command(): starting
41684 1727204476.48718: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0'
41684 1727204476.49235: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<<
41684 1727204476.49252: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<<
41684 1727204476.49271: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
41684 1727204476.49289: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<<
41684 1727204476.49358: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
41684 1727204476.49413: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<<
41684 1727204476.49421: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<<
41684 1727204476.49491: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
41684 1727204476.51172: stdout chunk (state=3): >>>/root <<<
41684 1727204476.51285: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
41684 1727204476.51336: stderr chunk (state=3): >>><<<
41684 1727204476.51340: stdout chunk (state=3): >>><<<
41684 1727204476.51349: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0
41684 1727204476.51360: _low_level_execute_command(): starting
41684 1727204476.51369: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204476.5134985-44278-280658323279142 `" && echo ansible-tmp-1727204476.5134985-44278-280658323279142="` echo /root/.ansible/tmp/ansible-tmp-1727204476.5134985-44278-280658323279142 `" ) && sleep 0'
41684 1727204476.51807: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<<
41684 1727204476.51813: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<<
41684 1727204476.51843: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found <<<
41684 1727204476.51847: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<<
41684 1727204476.51857: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
41684 1727204476.51900: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<<
41684 1727204476.51909: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
41684 1727204476.51982: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
41684 1727204476.53832: stdout chunk (state=3): >>>ansible-tmp-1727204476.5134985-44278-280658323279142=/root/.ansible/tmp/ansible-tmp-1727204476.5134985-44278-280658323279142 <<<
41684 1727204476.53946: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
41684 1727204476.54000: stderr chunk (state=3): >>><<<
41684 1727204476.54004: stdout chunk (state=3): >>><<<
41684 1727204476.54018: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204476.5134985-44278-280658323279142=/root/.ansible/tmp/ansible-tmp-1727204476.5134985-44278-280658323279142 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0
41684 1727204476.54057: variable 'ansible_module_compression' from source: unknown
41684 1727204476.54100: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-41684fyviudxd/ansiballz_cache/ansible.modules.service_facts-ZIP_DEFLATED
41684 1727204476.54134: variable 'ansible_facts' from source: unknown
41684 1727204476.54194: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204476.5134985-44278-280658323279142/AnsiballZ_service_facts.py
41684 1727204476.54303: Sending initial data
41684 1727204476.54306: Sent initial data (162 bytes)
41684 1727204476.54973: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<<
41684 1727204476.54978: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<<
41684 1727204476.54989: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<<
41684 1727204476.55032: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
41684 1727204476.55035: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<<
41684 1727204476.55038: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
41684 1727204476.55090: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<<
41684 1727204476.55099: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
41684 1727204476.55171: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
41684 1727204476.56857: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<<
41684 1727204476.56909: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<<
41684 1727204476.56961: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-41684fyviudxd/tmpc5ldto78 /root/.ansible/tmp/ansible-tmp-1727204476.5134985-44278-280658323279142/AnsiballZ_service_facts.py <<<
41684 1727204476.57013: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<<
41684 1727204476.58041: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
41684 1727204476.58051: stdout chunk (state=3): >>><<<
41684 1727204476.58061: stderr chunk (state=3): >>><<<
41684 1727204476.58084: done transferring module to remote
41684 1727204476.58099: _low_level_execute_command(): starting
41684 1727204476.58113: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204476.5134985-44278-280658323279142/ /root/.ansible/tmp/ansible-tmp-1727204476.5134985-44278-280658323279142/AnsiballZ_service_facts.py && sleep 0'
41684 1727204476.58819: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<<
41684 1727204476.58833: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<<
41684 1727204476.58847: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<<
41684 1727204476.58870: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<<
41684 1727204476.58921: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<<
41684 1727204476.58932: stderr chunk (state=3): >>>debug2: match not found <<<
41684 1727204476.58944: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
41684 1727204476.58963: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<<
41684 1727204476.58983: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<<
41684 1727204476.58996: stderr chunk (state=3): >>>debug1: re-parsing configuration <<<
41684 1727204476.59012: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<<
41684 1727204476.59026: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<<
41684 1727204476.59043: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<<
41684 1727204476.59057: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<<
41684 1727204476.59072: stderr chunk (state=3): >>>debug2: match found <<< 41684 1727204476.59087: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204476.59173: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 41684 1727204476.59194: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41684 1727204476.59216: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41684 1727204476.59307: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41684 1727204476.60985: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41684 1727204476.61033: stderr chunk (state=3): >>><<< 41684 1727204476.61037: stdout chunk (state=3): >>><<< 41684 1727204476.61050: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: 
Received exit status from master 0 41684 1727204476.61056: _low_level_execute_command(): starting 41684 1727204476.61060: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204476.5134985-44278-280658323279142/AnsiballZ_service_facts.py && sleep 0' 41684 1727204476.61755: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 41684 1727204476.61761: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41684 1727204476.61810: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found <<< 41684 1727204476.61816: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 41684 1727204476.61831: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 41684 1727204476.61837: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41684 1727204476.61843: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41684 1727204476.61848: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found <<< 41684 1727204476.61860: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204476.61953: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 41684 1727204476.61957: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41684 1727204476.61974: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 
41684 1727204476.62070: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41684 1727204477.90338: stdout chunk (state=3): >>> {"ansible_facts": {"services": {"auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, 
"kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": 
"not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "s<<< 41684 1727204477.90354: stdout chunk (state=3): >>>tate": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "snapd.seeded.service": {"name": "snapd.seeded.service", "state": "stopped", "status": "not-found", "source": "systemd"}, 
"sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": 
"systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stop<<< 41684 1727204477.90361: stdout chunk (state=3): >>>ped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles.service": {"name": "systemd-tmpfiles.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, 
"systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtim<<< 41684 1727204477.90394: stdout chunk (state=3): >>>e-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "yppasswdd.service": {"name": "yppasswdd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ypserv.service": {"name": "ypserv.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ypxfrd.service": {"name": "ypxfrd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": 
"inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "cpupower.service": {"name": "cpupower.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "inactive", "status": 
"static", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-up<<< 41684 1727204477.90400: stdout chunk (state=3): >>>date.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "oddjobd.service": {"name": "oddjobd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon.service": 
{"name": "quotaon.service", "state": "inactive", "status": "static", "source": "systemd"}, "rdisc.service": {"name": "rdisc.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": 
"inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate-resume@.service": {"name": "systemd-hibernate-resume@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": 
"systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-quotacheck.service": {"name": "systemd-quotacheck.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "teamd@.service": {"name": "teamd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} <<< 41684 1727204477.91644: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.148 closed. 
<<< 41684 1727204477.91704: stderr chunk (state=3): >>><<< 41684 1727204477.91708: stdout chunk (state=3): >>><<< 41684 1727204477.91741: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"services": {"auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": 
"systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": 
"systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", 
"status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "snapd.seeded.service": {"name": "snapd.seeded.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": 
"stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": 
"systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", 
"source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles.service": {"name": "systemd-tmpfiles.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "yppasswdd.service": {"name": "yppasswdd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ypserv.service": {"name": "ypserv.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ypxfrd.service": {"name": "ypxfrd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, 
"container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "cpupower.service": {"name": "cpupower.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "inactive", "status": "static", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": 
{"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "oddjobd.service": {"name": "oddjobd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon.service": {"name": "quotaon.service", "state": "inactive", "status": "static", "source": "systemd"}, "rdisc.service": {"name": "rdisc.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": 
"rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, 
"systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate-resume@.service": {"name": "systemd-hibernate-resume@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": 
"inactive", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-quotacheck.service": {"name": "systemd-quotacheck.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "teamd@.service": {"name": "teamd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.148 closed. 41684 1727204477.92131: done with _execute_module (service_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'service_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204476.5134985-44278-280658323279142/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 41684 1727204477.92139: _low_level_execute_command(): starting 41684 1727204477.92148: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204476.5134985-44278-280658323279142/ > /dev/null 2>&1 && sleep 0' 41684 1727204477.92630: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config <<< 41684 1727204477.92634: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41684 1727204477.92669: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204477.92683: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204477.92737: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 41684 1727204477.92742: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 41684 1727204477.92817: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41684 1727204477.94563: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41684 1727204477.94620: stderr chunk (state=3): >>><<< 41684 1727204477.94628: stdout chunk (state=3): >>><<< 41684 1727204477.94642: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41684 1727204477.94648: handler run complete 41684 1727204477.94753: variable 'ansible_facts' from source: unknown 41684 1727204477.95099: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41684 1727204477.95330: variable 'ansible_facts' from source: unknown 41684 1727204477.95403: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41684 1727204477.95507: attempt loop complete, returning result 41684 1727204477.95510: _execute() done 41684 1727204477.95513: dumping result to json 41684 1727204477.95546: done dumping result, returning 41684 1727204477.95553: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Check which services are running [0affcd87-79f5-3839-086d-0000000006ac] 41684 1727204477.95558: sending task result for task 0affcd87-79f5-3839-086d-0000000006ac 41684 1727204477.96266: done sending task result for task 0affcd87-79f5-3839-086d-0000000006ac 41684 1727204477.96269: WORKER PROCESS EXITING ok: [managed-node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 
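The `ok: [managed-node1]` result above closes the `service_facts` task; although the play censors it (`no_log: true`), the raw payload earlier in the log shows the shape of `ansible_facts["services"]`: a dict keyed by unit name, each value carrying `name`, `state`, `status`, and `source`. A minimal sketch of consuming such a payload — the helper and the tiny sample dict below are ours for illustration, with values copied from the log, not part of Ansible:

```python
# Sketch: filter the ansible_facts["services"] structure that service_facts
# returned above. The per-unit dict shape is taken from the log; this sample
# is an illustrative three-entry subset, not the full result.
def running_services(services):
    """Return names of units reported as running, sorted for stable output."""
    return sorted(name for name, svc in services.items()
                  if svc.get("state") == "running")

services = {
    "sshd.service": {"name": "sshd.service", "state": "running",
                     "status": "enabled", "source": "systemd"},
    "kdump.service": {"name": "kdump.service", "state": "stopped",
                      "status": "enabled", "source": "systemd"},
    "teamd@.service": {"name": "teamd@.service", "state": "unknown",
                       "status": "static", "source": "systemd"},
}

print(running_services(services))  # expected: ['sshd.service']
```

In a playbook the same filtering is typically done in Jinja2 (e.g. a `when: "'sshd.service' in ansible_facts.services"` guard), which is how the `fedora.linux_system_roles.network` role uses this fact.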
41684 1727204477.96320: no more pending results, returning what we have 41684 1727204477.96323: results queue empty 41684 1727204477.96323: checking for any_errors_fatal 41684 1727204477.96326: done checking for any_errors_fatal 41684 1727204477.96326: checking for max_fail_percentage 41684 1727204477.96327: done checking for max_fail_percentage 41684 1727204477.96328: checking to see if all hosts have failed and the running result is not ok 41684 1727204477.96328: done checking to see if all hosts have failed 41684 1727204477.96329: getting the remaining hosts for this loop 41684 1727204477.96330: done getting the remaining hosts for this loop 41684 1727204477.96332: getting the next task for host managed-node1 41684 1727204477.96336: done getting next task for host managed-node1 41684 1727204477.96339: ^ task is: TASK: fedora.linux_system_roles.network : Check which packages are installed 41684 1727204477.96343: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False), did rescue? False, did start at task? False 41684 1727204477.96349: getting variables 41684 1727204477.96350: in VariableManager get_vars() 41684 1727204477.96377: Calling all_inventory to load vars for managed-node1 41684 1727204477.96379: Calling groups_inventory to load vars for managed-node1 41684 1727204477.96380: Calling all_plugins_inventory to load vars for managed-node1 41684 1727204477.96387: Calling all_plugins_play to load vars for managed-node1 41684 1727204477.96388: Calling groups_plugins_inventory to load vars for managed-node1 41684 1727204477.96390: Calling groups_plugins_play to load vars for managed-node1 41684 1727204477.97091: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41684 1727204477.98029: done with get_vars() 41684 1727204477.98048: done getting variables TASK [fedora.linux_system_roles.network : Check which packages are installed] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26 Tuesday 24 September 2024 15:01:17 -0400 (0:00:01.514) 0:00:34.382 ***** 41684 1727204477.98122: entering _queue_task() for managed-node1/package_facts 41684 1727204477.98349: worker is 1 (out of 1 available) 41684 1727204477.98361: exiting _queue_task() for managed-node1/package_facts 41684 1727204477.98376: done queuing things up, now waiting for results queue to drain 41684 1727204477.98377: waiting for pending results... 
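The stderr chunks throughout this run repeatedly show `debug1: auto-mux: Trying existing master` followed by `mux_client_request_session`: Ansible is reusing one multiplexed SSH connection per host rather than performing a fresh handshake for every module invocation, which is why each `_low_level_execute_command()` round-trip is cheap. A hypothetical client-side configuration producing the same behavior outside Ansible (host taken from the log; `ControlPath`/`ControlPersist` values are illustrative, not from the log):

```
# ~/.ssh/config -- illustrative multiplexing setup
Host 10.31.9.148
    ControlMaster auto
    ControlPath ~/.ssh/cm-%r@%h:%p
    ControlPersist 60s
```

Ansible sets equivalent options itself via its default `ssh_args` (`-o ControlMaster=auto -o ControlPersist=60s`), which is what the `mux_client_hello_exchange: master version 4` lines reflect.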
41684 1727204477.98566: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Check which packages are installed 41684 1727204477.98660: in run() - task 0affcd87-79f5-3839-086d-0000000006ad 41684 1727204477.98677: variable 'ansible_search_path' from source: unknown 41684 1727204477.98681: variable 'ansible_search_path' from source: unknown 41684 1727204477.98711: calling self._execute() 41684 1727204477.98786: variable 'ansible_host' from source: host vars for 'managed-node1' 41684 1727204477.98792: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41684 1727204477.98801: variable 'omit' from source: magic vars 41684 1727204477.99080: variable 'ansible_distribution_major_version' from source: facts 41684 1727204477.99091: Evaluated conditional (ansible_distribution_major_version != '6'): True 41684 1727204477.99097: variable 'omit' from source: magic vars 41684 1727204477.99146: variable 'omit' from source: magic vars 41684 1727204477.99171: variable 'omit' from source: magic vars 41684 1727204477.99204: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 41684 1727204477.99232: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 41684 1727204477.99251: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 41684 1727204477.99266: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41684 1727204477.99277: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41684 1727204477.99301: variable 'inventory_hostname' from source: host vars for 'managed-node1' 41684 1727204477.99305: variable 'ansible_host' from source: host vars for 'managed-node1' 41684 1727204477.99307: variable 'ansible_ssh_extra_args' from source: host 
vars for 'managed-node1' 41684 1727204477.99383: Set connection var ansible_connection to ssh 41684 1727204477.99386: Set connection var ansible_pipelining to False 41684 1727204477.99392: Set connection var ansible_module_compression to ZIP_DEFLATED 41684 1727204477.99398: Set connection var ansible_timeout to 10 41684 1727204477.99404: Set connection var ansible_shell_executable to /bin/sh 41684 1727204477.99407: Set connection var ansible_shell_type to sh 41684 1727204477.99425: variable 'ansible_shell_executable' from source: unknown 41684 1727204477.99428: variable 'ansible_connection' from source: unknown 41684 1727204477.99431: variable 'ansible_module_compression' from source: unknown 41684 1727204477.99433: variable 'ansible_shell_type' from source: unknown 41684 1727204477.99435: variable 'ansible_shell_executable' from source: unknown 41684 1727204477.99438: variable 'ansible_host' from source: host vars for 'managed-node1' 41684 1727204477.99440: variable 'ansible_pipelining' from source: unknown 41684 1727204477.99442: variable 'ansible_timeout' from source: unknown 41684 1727204477.99447: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41684 1727204477.99592: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 41684 1727204477.99600: variable 'omit' from source: magic vars 41684 1727204477.99605: starting attempt loop 41684 1727204477.99608: running the handler 41684 1727204477.99618: _low_level_execute_command(): starting 41684 1727204477.99625: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 41684 1727204478.00150: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
<<< 41684 1727204478.00159: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41684 1727204478.00192: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found <<< 41684 1727204478.00208: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204478.00260: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 41684 1727204478.00275: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 41684 1727204478.00344: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41684 1727204478.01873: stdout chunk (state=3): >>>/root <<< 41684 1727204478.01972: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41684 1727204478.02025: stderr chunk (state=3): >>><<< 41684 1727204478.02031: stdout chunk (state=3): >>><<< 41684 1727204478.02053: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41684 1727204478.02067: _low_level_execute_command(): starting 41684 1727204478.02071: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204478.0205286-44333-1500786058788 `" && echo ansible-tmp-1727204478.0205286-44333-1500786058788="` echo /root/.ansible/tmp/ansible-tmp-1727204478.0205286-44333-1500786058788 `" ) && sleep 0' 41684 1727204478.02524: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 41684 1727204478.02530: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41684 1727204478.02567: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204478.02582: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204478.02635: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 41684 1727204478.02646: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 41684 1727204478.02714: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41684 1727204478.04549: stdout chunk (state=3): >>>ansible-tmp-1727204478.0205286-44333-1500786058788=/root/.ansible/tmp/ansible-tmp-1727204478.0205286-44333-1500786058788 <<< 41684 1727204478.04668: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41684 1727204478.04720: stderr chunk (state=3): >>><<< 41684 1727204478.04724: stdout chunk (state=3): >>><<< 41684 1727204478.04739: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204478.0205286-44333-1500786058788=/root/.ansible/tmp/ansible-tmp-1727204478.0205286-44333-1500786058788 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41684 1727204478.04782: variable 'ansible_module_compression' from source: unknown 41684 1727204478.04819: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-41684fyviudxd/ansiballz_cache/ansible.modules.package_facts-ZIP_DEFLATED 41684 1727204478.04869: variable 'ansible_facts' from source: unknown 41684 1727204478.04996: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204478.0205286-44333-1500786058788/AnsiballZ_package_facts.py 41684 1727204478.05116: Sending initial data 41684 1727204478.05120: Sent initial data (160 bytes) 41684 1727204478.06350: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 41684 1727204478.06353: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41684 1727204478.06390: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204478.06395: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 41684 1727204478.06398: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204478.06469: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 41684 1727204478.06472: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41684 1727204478.06478: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41684 1727204478.06536: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41684 1727204478.08227: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 41684 1727204478.08283: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 41684 1727204478.08347: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-41684fyviudxd/tmp35qthuo4 /root/.ansible/tmp/ansible-tmp-1727204478.0205286-44333-1500786058788/AnsiballZ_package_facts.py <<< 41684 1727204478.08401: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 41684 1727204478.11226: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41684 1727204478.11484: stderr chunk (state=3): >>><<< 41684 1727204478.11487: stdout chunk (state=3): >>><<< 41684 1727204478.11490: 
done transferring module to remote 41684 1727204478.11492: _low_level_execute_command(): starting 41684 1727204478.11494: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204478.0205286-44333-1500786058788/ /root/.ansible/tmp/ansible-tmp-1727204478.0205286-44333-1500786058788/AnsiballZ_package_facts.py && sleep 0' 41684 1727204478.12086: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 41684 1727204478.12100: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41684 1727204478.12113: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41684 1727204478.12129: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41684 1727204478.12178: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 41684 1727204478.12190: stderr chunk (state=3): >>>debug2: match not found <<< 41684 1727204478.12203: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204478.12219: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 41684 1727204478.12230: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 41684 1727204478.12240: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 41684 1727204478.12250: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41684 1727204478.12266: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41684 1727204478.12283: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41684 1727204478.12295: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 41684 1727204478.12306: stderr chunk (state=3): >>>debug2: 
match found <<< 41684 1727204478.12318: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204478.12399: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 41684 1727204478.12415: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41684 1727204478.12428: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41684 1727204478.12515: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41684 1727204478.14283: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41684 1727204478.14311: stderr chunk (state=3): >>><<< 41684 1727204478.14314: stdout chunk (state=3): >>><<< 41684 1727204478.14410: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41684 1727204478.14413: 
_low_level_execute_command(): starting 41684 1727204478.14416: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204478.0205286-44333-1500786058788/AnsiballZ_package_facts.py && sleep 0' 41684 1727204478.15001: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 41684 1727204478.15016: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41684 1727204478.15030: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41684 1727204478.15049: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41684 1727204478.15096: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 41684 1727204478.15109: stderr chunk (state=3): >>>debug2: match not found <<< 41684 1727204478.15124: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204478.15142: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 41684 1727204478.15155: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 41684 1727204478.15171: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 41684 1727204478.15184: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41684 1727204478.15199: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41684 1727204478.15217: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41684 1727204478.15230: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 41684 1727204478.15242: stderr chunk (state=3): >>>debug2: match found <<< 41684 1727204478.15256: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204478.15335: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 41684 1727204478.15357: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41684 1727204478.15379: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41684 1727204478.15485: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41684 1727204478.61302: stdout chunk (state=3): >>> {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240905", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240905", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.1.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools-wheel": [{"name": "python3-setuptools-wheel", "version": "53.0.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20210518", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.40", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.4", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libreport-filesystem": [{"name": 
"libreport-filesystem", "version": "2.15.2", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.348", "release": "9.15.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "7.el9.1", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dejavu-sans-fonts": [{"name": "dejavu-sans-fonts", "version": "2.37", "release": "18.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-core-font-en": [{"name": "langpacks-core-font-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "8.32", "release": "36.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.13.7", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.16", "release": "5.el9", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.1.8", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib": [{"name": "zlib", "version": "1.2.11", "release": "41.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.48", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.18", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": 
[{"name": "sqlite-libs", "version": "3.34.1", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.4", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.9.13", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240828", "release": "2.git626aa59.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.24", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.2", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "0.9.10", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.4", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.2", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.42", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": 
"2.4.2", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.0", "release": "13.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.3", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.40", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.8", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.8.0", "release": "7.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.9", "release": "9.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", 
"version": "0.16.1", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.10.0", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.39", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "38.20210216cvs.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.4", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.14", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.16.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.3", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.3", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.39", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.5", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.4.0", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": 
"libnftnl", "version": "1.2.6", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdb": [{"name": "libdb", "version": "5.3.28", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.2.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "53<<< 41684 1727204478.61331: stdout chunk (state=3): >>>.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.2", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsigsegv": [{"name": "libsigsegv", "version": "2.13", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre": [{"name": "pcre", "version": "8.44", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": 
"3.6", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "8.32", "release": "36.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "91.4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "252", "release": "47.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.12", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "28", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "28", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.6", "release": "27.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "29.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.6", "release": "27.el9", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "dbus-tools": [{"name": "dbus-tools", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "3.3.17", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.5.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.4", "release": "10.git1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.18", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3-cli": [{"name": "libnl3-cli", "version": "3.9.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libteam": [{"name": "libteam", "version": "1.31", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "acl": [{"name": "acl", "version": "2.3.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.21", "release": "8.el9", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "gettext": [{"name": "gettext", "version": "0.21", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "attr": [{"name": "attr", "version": "2.5.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.1.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.1.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.1", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.5.1", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.2", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.11", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.11", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.22.4", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.8", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-compat": [{"name": "libxcrypt-compat", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "21.3.1", "release": "1.el9", "epoch": 
null, "arch": "noarch", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.18", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.15.0", "release": "9.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.1", "release": "7.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "234", "release": "19.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.2", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "590", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-rpm-macros": [{"name": "systemd-rpm-macros", "version": "252", "release": "47.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.19.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": 
[{"name": "cpio", "version": "2.13", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.7", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "49", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.0.9", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.7.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdaemon": [{"name": "libdaemon", "version": "0.14", "release": "23.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "teamd": [{"name": "teamd", "version": "1.31", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.4.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.4", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.5.1", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "252", "release": "47.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "252", "release": "47.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "28", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.5.7", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.5.7", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": "1.11", "release": "26.20190603git.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el9", "epoch": 9, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el9", "epoch": 9, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.8.7", "release": "32.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"initscripts-service": [{"name": "initscripts-service", "version": "10.11.8", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20210202", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.4.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.4.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.18.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oddjob": [{"name": "oddjob", "version": "0.34.7", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oddjob-mkhomedir": [{"name": "oddjob-mkhomedir", "version": "0.34.7", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "3.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], 
"python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "3.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.43.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.27", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.6", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.4", "release": "13.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "7.76.1", "release": "31.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": 
"2.35.2", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.35.2", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "3.2.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "252", "release": "47.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.13.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.77", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "7.76.1", "release": "31.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.16.1.3", 
"release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "63.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.24", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "38.1.45", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "38.1.45", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.29", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.9.1", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "glib2": [{"name": "glib2", "version": "2.68.4", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.2.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.13.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.68.0", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.40.1", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.40.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libuser": [{"name": "libuser", "version": "0.63", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.3.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpgme": [{"name": "gpgme", "version": "1.15.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.14.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gpg": [{"name": "python3-gpg", "version": "1.15.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.6", "release": "1.el9.6", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.6", "release": "17.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.7.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.47", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.2", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.21", "release": "11.el9", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.12.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "0.99.9", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsyslog-logrotate": [{"name": "rsyslog-logrotate", "version": "8.2310.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2310.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "10.el9", "epoch": 17, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.4.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.47", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.27", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.3.0", "release": 
"16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "passwd": [{"name": "passwd", "version": "0.80", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "NetworkManager-team": [{"name": "NetworkManager-team", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.11.8", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240828", "release": "2.git626aa59.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": 
"dracut-config-rescue", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.5p2", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.9.3", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.2.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-compat": [{"name": "authselect-compat", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el9", "epoch": 4, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.1.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.19.2", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", 
"version": "2.1.1", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwl100-firmware": [{"name": "iwl100-firmware", "version": "39.31.5.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl1000-firmware": [{"name": "iwl1000-firmware", "version": "39.31.5.1", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "iwl105-firmware": [{"name": "iwl105-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl135-firmware": [{"name": "iwl135-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl2000-firmware": [{"name": "iwl2000-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl2030-firmware": [{"name": "iwl2030-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl3160-firmware": [{"name": "iwl3160-firmware", "version": "25.30.13.0", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "iwl5000-firmware": [{"name": "iwl5000-firmware", "version": "8.83.5.1_1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl5150-firmware": [{"name": "iwl5150-firmware", "version": "8.24.2.2", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl6000g2a-firmware": [{"name": "iwl6000g2a-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl6050-firmware": [{"name": "iwl6050-firmware", "version": "41.28.5.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl7260-firmware": [{"name": "iwl7260-firmware", "version": "25.30.13.0", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], 
"rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "31.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "gpg-pubkey": [{"name": "gpg-pubkey", "version": "3228467c", "release": "613798eb", "epoch": null, "arch": null, "source": "rpm"}, {"name": "gpg-pubkey", "version": "8483c65d", "release": "5ccc5b19", "epoch": null, "arch": null, "source": "rpm"}], "epel-release": [{"name": "epel-release", "version": "9", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "nspr": [{"name": "nspr", "version": "4.35.0", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-system": [{"name": "boost-system", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-util": [{"name": "nss-util", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.3", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "m4": [{"name": "m4", "version": "1.4.19", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmpc": [{"name": "libmpc", "version": "1.2.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unzip": [{"name": "unzip", "version": "6.0", "release": "57.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "avahi-libs": [{"name": "avahi-libs", "version": "0.8", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zip": [{"name": "zip", "version": "3.0", "release": "35.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpp": [{"name": "cpp", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bison": [{"name": "bison", "version": "3.7.4", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "flex": [{"name": "flex", "version": "2.6.4", "release": "9.el9", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "nss-softokn-freebl": [{"name": "nss-softokn-freebl", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn": [{"name": "nss-softokn", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss": [{"name": "nss", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-sysinit": [{"name": "nss-sysinit", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-filesystem": [{"name": "boost-filesystem", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-thread": [{"name": "boost-thread", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.19", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.58", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.80", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.03", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.174", "release": "462.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.13", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.09", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "481.el9", "epoch": 0, "arch": 
"noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20200520", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], <<< 41684 1727204478.61436: stdout chunk (state=3): >>>"perl-if": [{"name": "perl-if", "version": "0.60.800", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.41", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.300", "release": "7.el9", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2013.0523", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.073", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.66", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "1.94", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.21", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-subs": [{"name": "perl-subs", "version": "1.03", "release": "481.el9", "epoch": 0,
"arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.17", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.42", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.076", "release": "462.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.031", "release": "4.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.08", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.09", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "4.14", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.13", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.30", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.23", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.43", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", 
"release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.01", "release": "4.el9", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.30", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], <<< 41684 1727204478.61445: stdout chunk (state=3): >>>"perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.85", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.12", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.56", "release": "462.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.21", "release": "460.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.31", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.238", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.52", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.50", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.74", "release": "461.el9", "epoch": null, "arch":
"noarch", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.15", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.78", "release": "461.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.08", "release": "462.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.32.1", "release": "481.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.32.1", "release": "481.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "aspell": [{"name": "aspell", "version": "0.60.8", "release": "8.el9", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "tbb": [{"name": "tbb", "version": "2020.3", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dyninst": [{"name": "dyninst", "version": "12.1.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-runtime": [{"name": "systemtap-runtime", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-headers": [{"name": "kernel-headers", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-headers": [{"name": "glibc-headers", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "strace": [{"name": "strace", "version": "5.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-m4": [{"name": "pkgconf-m4", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libpkgconf": [{"name": "libpkgconf", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf": [{"name": "pkgconf", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"pkgconf-pkg-config": [{"name": "pkgconf-pkg-config", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd-devel": [{"name": "libzstd-devel", "version": "1.5.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-devel": [{"name": "zlib-devel", "version": "1.2.11", "release": "41.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf-devel": [{"name": "elfutils-libelf-devel", "version": <<< 41684 1727204478.61451: stdout chunk (state=3): >>>"0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-devel": [{"name": "glibc-devel", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-devel": [{"name": "libxcrypt-devel", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gcc": [{"name": "gcc", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-devel": [{"name": "openssl-devel", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-devel": [{"name": "kernel-devel", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-devel": [{"name": "xz-devel", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-devel": [{"name": "elfutils-devel", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-devel": [{"name": "systemtap-devel", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "efivar-libs": [{"name": "efivar-libs", "version": "38", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mokutil": [{"name": "mokutil", "version": "0.6.0", "release": "4.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "systemtap-client": [{"name": "systemtap-client", "version":
"5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap": [{"name": "systemtap", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "qa-tools": [{"name": "qa-tools", "version": "4.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.3", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.5.4", "release": "27.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.6", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.21.1", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.37", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "7.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.47", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.34", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "4.6.5", "release": "3.el9", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "gpm-libs": [{"name": "gpm-libs", "version": "1.20.7", "release": "29.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "emacs-filesystem": [{"name": "emacs-filesystem", "version": "27.2", "release": "10.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", <<< 41684 1727204478.61467: stdout chunk (state=3): >>>"source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "18.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.34", "release": "7.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "4.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "5.4.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.8.4", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name":
"nfs-utils", "version": "2.5.4", "release": "27.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.2.3", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "53.0.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.5.0", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "2.10", "release": "7.el9.1", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pytz": [{"name": "python3-pytz", "version": "2021.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-babel": [{"name": "python3-babel", "version": "2.9.1", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": 
[{"name": "python3-pyserial", "version": "3.4", "release": "12.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyrsistent": [{"name": "python3-pyrsistent", "version": "0.17.3", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-prettytable": [{"name": "python3-prettytable", "version": "0.7.2", "release": "27.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.1.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-netifaces": [{"name": "python3-netifaces", "version": "0.10.6", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "1.1.1", "release": "12.el9",<<< 41684 1727204478.61491: stdout chunk (state=3): >>> "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "2.11.3", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.0", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.21", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.6", "release": "25.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "20.3.0", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "3.2.0", "release": "13.el9", 
"epoch": null, "arch": "noarch", "source": "rpm"}], "libmaxminddb": [{"name": "libmaxminddb", "version": "1.5.2", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "geolite2-country": [{"name": "geolite2-country", "version": "20191217", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "geolite2-city": [{"name": "geolite2-city", "version": "20191217", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "ipcalc": [{"name": "ipcalc", "version": "1.0.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdisk": [{"name": "gdisk", "version": "1.0.7", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pysocks": [{"name": "python3-pysocks", "version": "1.7.1", "release": "12.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.5", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-chardet": [{"name": "python3-chardet", "version": "4.0.0", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.25.1", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dhcp-common": [{"name": "dhcp-common", "version": "4.4.2", "release": "19.b1.el9", "epoch": 12, "arch": "noarch", "source": "rpm"}], "dhcp-client": [{"name": "dhcp-client", "version": "4.4.2", "release": "19.b1.el9", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "23.4", "release": "19.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "cloud-utils-growpart": [{"name": 
"cloud-utils-growpart", "version": "0.33", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.16", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip": [{"name": "python3-pip", "version": "21.3.1", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "5.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.85", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} <<< 41684 1727204478.62927: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.148 closed. 
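The payload streamed above ends with `"invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}`, which matches the return shape of Ansible's `package_facts` module: `ansible_facts.packages` maps each package name to a *list* of installed instances (multilib packages and the two `gpg-pubkey` entries show why it is a list, not a single object). A minimal sketch of consuming such a result outside Ansible; the miniature `raw` document and the `installed_versions` helper are illustrative assumptions trimmed from the full dump:

```python
import json

# Hypothetical miniature of the package_facts result logged above;
# the real payload lists every RPM on the managed node.
raw = '''{"ansible_facts": {"packages": {
  "bash": [{"name": "bash", "version": "5.1.8", "release": "9.el9",
            "epoch": null, "arch": "x86_64", "source": "rpm"}],
  "gpg-pubkey": [{"name": "gpg-pubkey", "version": "3228467c",
                  "release": "613798eb", "epoch": null, "arch": null,
                  "source": "rpm"},
                 {"name": "gpg-pubkey", "version": "8483c65d",
                  "release": "5ccc5b19", "epoch": null, "arch": null,
                  "source": "rpm"}]}},
"invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}}'''

facts = json.loads(raw)
packages = facts["ansible_facts"]["packages"]

# Each key maps to a list because a name can be installed more than once
# (multilib, or multiple imported GPG keys, as with gpg-pubkey here).
def installed_versions(name):
    return [p["version"] for p in packages.get(name, [])]

print(installed_versions("bash"))             # ['5.1.8']
print(len(installed_versions("gpg-pubkey")))  # 2
```

In a playbook the same structure is reached via `ansible_facts.packages` after a `package_facts:` task, e.g. a `when: "'bash' in ansible_facts.packages"` guard.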
<<< 41684 1727204478.62989: stderr chunk (state=3): >>><<< 41684 1727204478.62992: stdout chunk (state=3): >>><<< 41684 1727204478.63040: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240905", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240905", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.1.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools-wheel": [{"name": "python3-setuptools-wheel", "version": "53.0.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20210518", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.40", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.4", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libreport-filesystem": [{"name": "libreport-filesystem", "version": "2.15.2", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.4.0", "release": "10.el9", 
"epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.348", "release": "9.15.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "7.el9.1", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dejavu-sans-fonts": [{"name": "dejavu-sans-fonts", "version": "2.37", "release": "18.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-core-font-en": [{"name": "langpacks-core-font-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "8.32", "release": "36.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.13.7", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.16", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.1.8", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib": [{"name": "zlib", "version": "1.2.11", "release": "41.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.48", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.18", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.34.1", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "11.5.0", "release": "2.el9", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.4", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.9.13", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240828", "release": "2.git626aa59.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.24", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.2", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "0.9.10", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.4", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.2", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.42", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.0", "release": "13.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], 
"keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.3", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.40", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.8", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.8.0", "release": "7.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.9", "release": "9.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.10.0", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.39", "release": "16.el9", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "38.20210216cvs.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.4", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.14", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.16.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.3", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.3", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.39", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.5", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.4.0", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.6", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "53.el9", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdb": [{"name": "libdb", "version": "5.3.28", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.2.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.2", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsigsegv": [{"name": "libsigsegv", "version": "2.13", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre": [{"name": "pcre", "version": "8.44", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.6", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "8.32", "release": "36.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": 
"ca-certificates", "version": "2024.2.69_v8.0.303", "release": "91.4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "252", "release": "47.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.12", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "28", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "28", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.6", "release": "27.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "29.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.6", "release": "27.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-tools": [{"name": "dbus-tools", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "3.3.17", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "6.el9", "epoch": 
1, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.5.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.4", "release": "10.git1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.18", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3-cli": [{"name": "libnl3-cli", "version": "3.9.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libteam": [{"name": "libteam", "version": "1.31", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "acl": [{"name": "acl", "version": "2.3.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.21", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext": [{"name": "gettext", "version": "0.21", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "attr": [{"name": "attr", "version": "2.5.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": 
"4.1.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.1.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.1", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.5.1", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.2", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.11", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.11", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.22.4", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.8", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-compat": [{"name": "libxcrypt-compat", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "21.3.1", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.9.19", "release": "8.el9", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.18", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.15.0", "release": "9.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.1", "release": "7.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "234", "release": "19.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.2", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "590", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-rpm-macros": [{"name": "systemd-rpm-macros", "version": "252", "release": "47.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.19.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.13", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.7", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "49", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.0.9", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "libcbor": [{"name": "libcbor", "version": "0.7.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdaemon": [{"name": "libdaemon", "version": "0.14", "release": "23.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "teamd": [{"name": "teamd", "version": "1.31", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.4.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.4", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.5.1", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "252", "release": "47.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "252", "release": "47.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "28", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.5.7", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.5.7", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": "1.11", "release": "26.20190603git.el9", 
"epoch": null, "arch": "noarch", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el9", "epoch": 9, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el9", "epoch": 9, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.8.7", "release": "32.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.11.8", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20210202", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.4.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.4.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"logrotate": [{"name": "logrotate", "version": "3.18.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oddjob": [{"name": "oddjob", "version": "0.34.7", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oddjob-mkhomedir": [{"name": "oddjob-mkhomedir", "version": "0.34.7", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "3.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "3.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.43.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"libpipeline": [{"name": "libpipeline", "version": "1.5.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.27", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.6", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.4", "release": "13.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "7.76.1", "release": "31.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.35.2", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.35.2", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "3.2.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "252", "release": "47.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"kernel-modules-core": [{"name": "kernel-modules-core", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.13.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.77", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "7.76.1", "release": "31.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "63.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.24", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "38.1.45", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "38.1.45", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.29", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.9.1", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.68.4", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.2.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.13.0", "release": "2.el9", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.68.0", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.40.1", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.40.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libuser": [{"name": "libuser", "version": "0.63", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.3.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpgme": [{"name": "gpgme", "version": "1.15.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.14.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gpg": [{"name": "python3-gpg", "version": "1.15.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.16.1.3", "release": 
"34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.6", "release": "1.el9.6", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.6", "release": "17.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.7.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.47", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.2", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.21", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.12.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "0.99.9", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsyslog-logrotate": [{"name": "rsyslog-logrotate", "version": "8.2310.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2310.0", "release": "4.el9", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "10.el9", "epoch": 17, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.4.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.47", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.27", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "passwd": [{"name": "passwd", "version": "0.80", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "NetworkManager-team": [{"name": "NetworkManager-team", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": 
"10.11.8", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240828", "release": "2.git626aa59.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.5p2", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.9.3", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": 
"iproute-tc", "version": "6.2.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-compat": [{"name": "authselect-compat", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el9", "epoch": 4, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.1.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.19.2", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwl100-firmware": [{"name": "iwl100-firmware", "version": "39.31.5.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl1000-firmware": [{"name": "iwl1000-firmware", "version": "39.31.5.1", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "iwl105-firmware": [{"name": "iwl105-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"iwl135-firmware": [{"name": "iwl135-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl2000-firmware": [{"name": "iwl2000-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl2030-firmware": [{"name": "iwl2030-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl3160-firmware": [{"name": "iwl3160-firmware", "version": "25.30.13.0", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "iwl5000-firmware": [{"name": "iwl5000-firmware", "version": "8.83.5.1_1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl5150-firmware": [{"name": "iwl5150-firmware", "version": "8.24.2.2", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl6000g2a-firmware": [{"name": "iwl6000g2a-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl6050-firmware": [{"name": "iwl6050-firmware", "version": "41.28.5.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl7260-firmware": [{"name": "iwl7260-firmware", "version": "25.30.13.0", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "31.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "gpg-pubkey": [{"name": "gpg-pubkey", "version": "3228467c", "release": "613798eb", "epoch": null, "arch": null, "source": "rpm"}, {"name": "gpg-pubkey", "version": "8483c65d", "release": "5ccc5b19", "epoch": null, "arch": null, "source": "rpm"}], "epel-release": [{"name": "epel-release", "version": "9", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "nspr": [{"name": "nspr", "version": "4.35.0", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-system": [{"name": 
"boost-system", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-util": [{"name": "nss-util", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.3", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "m4": [{"name": "m4", "version": "1.4.19", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmpc": [{"name": "libmpc", "version": "1.2.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unzip": [{"name": "unzip", "version": "6.0", "release": "57.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "avahi-libs": [{"name": "avahi-libs", "version": "0.8", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zip": [{"name": "zip", "version": "3.0", "release": "35.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpp": [{"name": "cpp", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bison": [{"name": "bison", "version": "3.7.4", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "flex": [{"name": "flex", "version": "2.6.4", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn-freebl": [{"name": "nss-softokn-freebl", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn": [{"name": "nss-softokn", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss": [{"name": "nss", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-sysinit": [{"name": "nss-sysinit", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-filesystem": [{"name": "boost-filesystem", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-thread": 
[{"name": "boost-thread", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.19", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.58", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.80", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.03", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.174", "release": "462.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.13", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.09", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20200520", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.60.800", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.41", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.300", "release": "7.el9", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2013.0523", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.073", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.66", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "1.94", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.21", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-subs": [{"name": "perl-subs", "version": "1.03", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.17", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.42", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.076", "release": "462.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.031", "release": "4.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", 
"version": "1.02", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.08", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.09", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "4.14", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.13", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.30", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.23", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.43", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.01", "release": "4.el9", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.30", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.85", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.12", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.56", "release": "462.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.21", "release": "460.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.31", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.238", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.52", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.50", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.74", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.15", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.78", "release": "461.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.08", "release": "462.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.32.1", "release": "481.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.32.1", "release": "481.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "aspell": [{"name": "aspell", "version": "0.60.8", "release": "8.el9", "epoch": 12, "arch": "x86_64", 
"source": "rpm"}], "tbb": [{"name": "tbb", "version": "2020.3", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dyninst": [{"name": "dyninst", "version": "12.1.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-runtime": [{"name": "systemtap-runtime", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-headers": [{"name": "kernel-headers", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-headers": [{"name": "glibc-headers", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "strace": [{"name": "strace", "version": "5.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-m4": [{"name": "pkgconf-m4", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libpkgconf": [{"name": "libpkgconf", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf": [{"name": "pkgconf", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-pkg-config": [{"name": "pkgconf-pkg-config", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd-devel": [{"name": "libzstd-devel", "version": "1.5.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-devel": [{"name": "zlib-devel", "version": "1.2.11", "release": "41.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf-devel": [{"name": "elfutils-libelf-devel", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-devel": [{"name": "glibc-devel", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-devel": [{"name": "libxcrypt-devel", "version": "4.4.18", "release": "3.el9", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "gcc": [{"name": "gcc", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-devel": [{"name": "openssl-devel", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-devel": [{"name": "kernel-devel", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-devel": [{"name": "xz-devel", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-devel": [{"name": "elfutils-devel", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-devel": [{"name": "systemtap-devel", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "efivar-libs": [{"name": "efivar-libs", "version": "38", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mokutil": [{"name": "mokutil", "version": "0.6.0", "release": "4.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "systemtap-client": [{"name": "systemtap-client", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap": [{"name": "systemtap", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "qa-tools": [{"name": "qa-tools", "version": "4.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.3", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.5.4", "release": "27.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": 
[{"name": "rpcbind", "version": "1.2.6", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.21.1", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.37", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "7.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.47", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.34", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "4.6.5", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpm-libs": [{"name": "gpm-libs", "version": "1.20.7", "release": "29.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "emacs-filesystem": [{"name": "emacs-filesystem", "version": "27.2", "release": "10.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "8.2.2637", 
"release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "18.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.34", "release": "7.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "4.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "5.4.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.8.4", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.5.4", "release": "27.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": 
"sssd-nfs-idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.2.3", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "53.0.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.5.0", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "2.10", "release": "7.el9.1", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pytz": [{"name": "python3-pytz", "version": "2021.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-babel": [{"name": "python3-babel", "version": "2.9.1", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.4", "release": "12.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyrsistent": [{"name": "python3-pyrsistent", "version": "0.17.3", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-prettytable": [{"name": "python3-prettytable", "version": "0.7.2", "release": "27.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.1.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-netifaces": [{"name": "python3-netifaces", "version": "0.10.6", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "1.1.1", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": 
"2.11.3", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.0", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.21", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.6", "release": "25.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "20.3.0", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "3.2.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libmaxminddb": [{"name": "libmaxminddb", "version": "1.5.2", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "geolite2-country": [{"name": "geolite2-country", "version": "20191217", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "geolite2-city": [{"name": "geolite2-city", "version": "20191217", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "ipcalc": [{"name": "ipcalc", "version": "1.0.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdisk": [{"name": "gdisk", "version": "1.0.7", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": 
"noarch", "source": "rpm"}], "python3-pysocks": [{"name": "python3-pysocks", "version": "1.7.1", "release": "12.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.5", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-chardet": [{"name": "python3-chardet", "version": "4.0.0", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.25.1", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dhcp-common": [{"name": "dhcp-common", "version": "4.4.2", "release": "19.b1.el9", "epoch": 12, "arch": "noarch", "source": "rpm"}], "dhcp-client": [{"name": "dhcp-client", "version": "4.4.2", "release": "19.b1.el9", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "23.4", "release": "19.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.16", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip": [{"name": "python3-pip", "version": "21.3.1", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "5.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.85", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} , 
stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.148 closed. 
41684 1727204478.64476: done with _execute_module (package_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'package_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204478.0205286-44333-1500786058788/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 41684 1727204478.64498: _low_level_execute_command(): starting 41684 1727204478.64502: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204478.0205286-44333-1500786058788/ > /dev/null 2>&1 && sleep 0' 41684 1727204478.64971: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 41684 1727204478.64984: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41684 1727204478.65010: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found <<< 41684 1727204478.65028: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 
originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204478.65071: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 41684 1727204478.65083: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 41684 1727204478.65149: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41684 1727204478.66948: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41684 1727204478.67003: stderr chunk (state=3): >>><<< 41684 1727204478.67006: stdout chunk (state=3): >>><<< 41684 1727204478.67019: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41684 1727204478.67025: handler run complete 41684 1727204478.67529: variable 'ansible_facts' from 
source: unknown 41684 1727204478.67881: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41684 1727204478.69038: variable 'ansible_facts' from source: unknown 41684 1727204478.69305: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41684 1727204478.69739: attempt loop complete, returning result 41684 1727204478.69749: _execute() done 41684 1727204478.69752: dumping result to json 41684 1727204478.69879: done dumping result, returning 41684 1727204478.69887: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Check which packages are installed [0affcd87-79f5-3839-086d-0000000006ad] 41684 1727204478.69893: sending task result for task 0affcd87-79f5-3839-086d-0000000006ad 41684 1727204478.71239: done sending task result for task 0affcd87-79f5-3839-086d-0000000006ad 41684 1727204478.71243: WORKER PROCESS EXITING ok: [managed-node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 41684 1727204478.71334: no more pending results, returning what we have 41684 1727204478.71336: results queue empty 41684 1727204478.71337: checking for any_errors_fatal 41684 1727204478.71340: done checking for any_errors_fatal 41684 1727204478.71341: checking for max_fail_percentage 41684 1727204478.71342: done checking for max_fail_percentage 41684 1727204478.71342: checking to see if all hosts have failed and the running result is not ok 41684 1727204478.71343: done checking to see if all hosts have failed 41684 1727204478.71343: getting the remaining hosts for this loop 41684 1727204478.71344: done getting the remaining hosts for this loop 41684 1727204478.71347: getting the next task for host managed-node1 41684 1727204478.71352: done getting next task for host managed-node1 41684 1727204478.71355: ^ task is: TASK: fedora.linux_system_roles.network : Print 
network provider 41684 1727204478.71358: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 41684 1727204478.71369: getting variables 41684 1727204478.71370: in VariableManager get_vars() 41684 1727204478.71397: Calling all_inventory to load vars for managed-node1 41684 1727204478.71399: Calling groups_inventory to load vars for managed-node1 41684 1727204478.71401: Calling all_plugins_inventory to load vars for managed-node1 41684 1727204478.71408: Calling all_plugins_play to load vars for managed-node1 41684 1727204478.71410: Calling groups_plugins_inventory to load vars for managed-node1 41684 1727204478.71411: Calling groups_plugins_play to load vars for managed-node1 41684 1727204478.72128: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41684 1727204478.73051: done with get_vars() 41684 1727204478.73070: done getting variables 41684 1727204478.73114: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 
(found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Print network provider] ************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7 Tuesday 24 September 2024 15:01:18 -0400 (0:00:00.750) 0:00:35.132 ***** 41684 1727204478.73144: entering _queue_task() for managed-node1/debug 41684 1727204478.73375: worker is 1 (out of 1 available) 41684 1727204478.73388: exiting _queue_task() for managed-node1/debug 41684 1727204478.73401: done queuing things up, now waiting for results queue to drain 41684 1727204478.73402: waiting for pending results... 41684 1727204478.73592: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Print network provider 41684 1727204478.73677: in run() - task 0affcd87-79f5-3839-086d-000000000642 41684 1727204478.73690: variable 'ansible_search_path' from source: unknown 41684 1727204478.73693: variable 'ansible_search_path' from source: unknown 41684 1727204478.73722: calling self._execute() 41684 1727204478.73800: variable 'ansible_host' from source: host vars for 'managed-node1' 41684 1727204478.73804: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41684 1727204478.73812: variable 'omit' from source: magic vars 41684 1727204478.74101: variable 'ansible_distribution_major_version' from source: facts 41684 1727204478.74113: Evaluated conditional (ansible_distribution_major_version != '6'): True 41684 1727204478.74118: variable 'omit' from source: magic vars 41684 1727204478.74159: variable 'omit' from source: magic vars 41684 1727204478.74231: variable 'network_provider' from source: set_fact 41684 1727204478.74247: variable 'omit' from source: magic vars 41684 1727204478.74282: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 41684 1727204478.74311: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py 
(found_in_cache=True, class_only=False) 41684 1727204478.74328: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 41684 1727204478.74343: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41684 1727204478.74353: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41684 1727204478.74379: variable 'inventory_hostname' from source: host vars for 'managed-node1' 41684 1727204478.74384: variable 'ansible_host' from source: host vars for 'managed-node1' 41684 1727204478.74387: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41684 1727204478.74458: Set connection var ansible_connection to ssh 41684 1727204478.74466: Set connection var ansible_pipelining to False 41684 1727204478.74469: Set connection var ansible_module_compression to ZIP_DEFLATED 41684 1727204478.74475: Set connection var ansible_timeout to 10 41684 1727204478.74481: Set connection var ansible_shell_executable to /bin/sh 41684 1727204478.74484: Set connection var ansible_shell_type to sh 41684 1727204478.74503: variable 'ansible_shell_executable' from source: unknown 41684 1727204478.74507: variable 'ansible_connection' from source: unknown 41684 1727204478.74509: variable 'ansible_module_compression' from source: unknown 41684 1727204478.74511: variable 'ansible_shell_type' from source: unknown 41684 1727204478.74513: variable 'ansible_shell_executable' from source: unknown 41684 1727204478.74516: variable 'ansible_host' from source: host vars for 'managed-node1' 41684 1727204478.74518: variable 'ansible_pipelining' from source: unknown 41684 1727204478.74522: variable 'ansible_timeout' from source: unknown 41684 1727204478.74526: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41684 1727204478.74631: Loading ActionModule 'debug' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 41684 1727204478.74639: variable 'omit' from source: magic vars 41684 1727204478.74644: starting attempt loop 41684 1727204478.74648: running the handler 41684 1727204478.74687: handler run complete 41684 1727204478.74698: attempt loop complete, returning result 41684 1727204478.74701: _execute() done 41684 1727204478.74704: dumping result to json 41684 1727204478.74707: done dumping result, returning 41684 1727204478.74714: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Print network provider [0affcd87-79f5-3839-086d-000000000642] 41684 1727204478.74719: sending task result for task 0affcd87-79f5-3839-086d-000000000642 41684 1727204478.74799: done sending task result for task 0affcd87-79f5-3839-086d-000000000642 41684 1727204478.74801: WORKER PROCESS EXITING ok: [managed-node1] => {} MSG: Using network provider: nm 41684 1727204478.74891: no more pending results, returning what we have 41684 1727204478.74895: results queue empty 41684 1727204478.74896: checking for any_errors_fatal 41684 1727204478.74904: done checking for any_errors_fatal 41684 1727204478.74904: checking for max_fail_percentage 41684 1727204478.74906: done checking for max_fail_percentage 41684 1727204478.74906: checking to see if all hosts have failed and the running result is not ok 41684 1727204478.74907: done checking to see if all hosts have failed 41684 1727204478.74908: getting the remaining hosts for this loop 41684 1727204478.74910: done getting the remaining hosts for this loop 41684 1727204478.74913: getting the next task for host managed-node1 41684 1727204478.74920: done getting next task for host managed-node1 41684 1727204478.74923: ^ task is: TASK: 
fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 41684 1727204478.74928: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 41684 1727204478.74938: getting variables 41684 1727204478.74939: in VariableManager get_vars() 41684 1727204478.74978: Calling all_inventory to load vars for managed-node1 41684 1727204478.74981: Calling groups_inventory to load vars for managed-node1 41684 1727204478.74983: Calling all_plugins_inventory to load vars for managed-node1 41684 1727204478.74991: Calling all_plugins_play to load vars for managed-node1 41684 1727204478.74993: Calling groups_plugins_inventory to load vars for managed-node1 41684 1727204478.74996: Calling groups_plugins_play to load vars for managed-node1 41684 1727204478.75866: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41684 1727204478.76786: done with get_vars() 41684 1727204478.76802: done getting variables 41684 1727204478.76846: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11 Tuesday 24 September 2024 15:01:18 -0400 (0:00:00.037) 0:00:35.170 ***** 41684 1727204478.76873: entering _queue_task() for managed-node1/fail 41684 1727204478.77092: worker is 1 (out of 1 available) 41684 1727204478.77104: exiting _queue_task() for managed-node1/fail 41684 1727204478.77116: done queuing things up, now waiting for results queue to drain 41684 1727204478.77118: waiting for pending results... 
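The "Print network provider" task that completed above (task path roles/network/tasks/main.yml:7) emitted `MSG: Using network provider: nm` via the `debug` action, reading `network_provider` from a `set_fact` source. A sketch of what that task likely looks like, reconstructed from this log rather than from the actual role source:

```yaml
# Hypothetical reconstruction of roles/network/tasks/main.yml:7;
# the real task file may differ in wording and structure.
- name: Print network provider
  debug:
    msg: "Using network provider: {{ network_provider }}"
```

The log confirms the action plugin (`debug`), the variable name (`network_provider`), and the rendered message; everything else is an assumption.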
41684 1727204478.77302: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 41684 1727204478.77386: in run() - task 0affcd87-79f5-3839-086d-000000000643 41684 1727204478.77397: variable 'ansible_search_path' from source: unknown 41684 1727204478.77401: variable 'ansible_search_path' from source: unknown 41684 1727204478.77429: calling self._execute() 41684 1727204478.77505: variable 'ansible_host' from source: host vars for 'managed-node1' 41684 1727204478.77511: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41684 1727204478.77520: variable 'omit' from source: magic vars 41684 1727204478.77802: variable 'ansible_distribution_major_version' from source: facts 41684 1727204478.77814: Evaluated conditional (ansible_distribution_major_version != '6'): True 41684 1727204478.77898: variable 'network_state' from source: role '' defaults 41684 1727204478.77906: Evaluated conditional (network_state != {}): False 41684 1727204478.77910: when evaluation is False, skipping this task 41684 1727204478.77913: _execute() done 41684 1727204478.77921: dumping result to json 41684 1727204478.77924: done dumping result, returning 41684 1727204478.77929: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider [0affcd87-79f5-3839-086d-000000000643] 41684 1727204478.77936: sending task result for task 0affcd87-79f5-3839-086d-000000000643 41684 1727204478.78018: done sending task result for task 0affcd87-79f5-3839-086d-000000000643 41684 1727204478.78022: WORKER PROCESS EXITING skipping: [managed-node1] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 41684 1727204478.78074: no more pending results, 
returning what we have 41684 1727204478.78078: results queue empty 41684 1727204478.78079: checking for any_errors_fatal 41684 1727204478.78085: done checking for any_errors_fatal 41684 1727204478.78086: checking for max_fail_percentage 41684 1727204478.78087: done checking for max_fail_percentage 41684 1727204478.78088: checking to see if all hosts have failed and the running result is not ok 41684 1727204478.78089: done checking to see if all hosts have failed 41684 1727204478.78089: getting the remaining hosts for this loop 41684 1727204478.78091: done getting the remaining hosts for this loop 41684 1727204478.78094: getting the next task for host managed-node1 41684 1727204478.78100: done getting next task for host managed-node1 41684 1727204478.78104: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 41684 1727204478.78107: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 41684 1727204478.78123: getting variables 41684 1727204478.78124: in VariableManager get_vars() 41684 1727204478.78167: Calling all_inventory to load vars for managed-node1 41684 1727204478.78170: Calling groups_inventory to load vars for managed-node1 41684 1727204478.78172: Calling all_plugins_inventory to load vars for managed-node1 41684 1727204478.78180: Calling all_plugins_play to load vars for managed-node1 41684 1727204478.78183: Calling groups_plugins_inventory to load vars for managed-node1 41684 1727204478.78185: Calling groups_plugins_play to load vars for managed-node1 41684 1727204478.78942: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41684 1727204478.79869: done with get_vars() 41684 1727204478.79886: done getting variables 41684 1727204478.79927: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18 Tuesday 24 September 2024 15:01:18 -0400 (0:00:00.030) 0:00:35.201 ***** 41684 1727204478.79948: entering _queue_task() for managed-node1/fail 41684 1727204478.80150: worker is 1 (out of 1 available) 41684 1727204478.80166: exiting _queue_task() for managed-node1/fail 41684 1727204478.80179: done queuing things up, now waiting for results queue to drain 41684 1727204478.80181: waiting for pending results... 
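The `fail` task skipped above (roles/network/tasks/main.yml:11) reported `"false_condition": "network_state != {}"`, so its `when` list short-circuited on that first conditional. A sketch under those observations; the failure message and the provider check are assumptions, since the log never reached them:

```yaml
# Hypothetical reconstruction of roles/network/tasks/main.yml:11.
# Only the first `when` conditional is confirmed by the log.
- name: >-
    Abort applying the network state configuration if using the
    `network_state` variable with the initscripts provider
  fail:
    msg: Cannot use the network_state variable with the initscripts provider  # assumed message
  when:
    - network_state != {}                      # confirmed: this evaluated False, so the task skipped
    - network_provider == "initscripts"        # assumed: implied by the task name, never evaluated here
```

Because `when` conditionals are AND-ed and evaluated in order, an empty `network_state` (the role default) is enough to skip the task regardless of provider.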
41684 1727204478.80352: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 41684 1727204478.80435: in run() - task 0affcd87-79f5-3839-086d-000000000644 41684 1727204478.80446: variable 'ansible_search_path' from source: unknown 41684 1727204478.80451: variable 'ansible_search_path' from source: unknown 41684 1727204478.80481: calling self._execute() 41684 1727204478.80553: variable 'ansible_host' from source: host vars for 'managed-node1' 41684 1727204478.80560: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41684 1727204478.80572: variable 'omit' from source: magic vars 41684 1727204478.80835: variable 'ansible_distribution_major_version' from source: facts 41684 1727204478.80846: Evaluated conditional (ansible_distribution_major_version != '6'): True 41684 1727204478.80931: variable 'network_state' from source: role '' defaults 41684 1727204478.80939: Evaluated conditional (network_state != {}): False 41684 1727204478.80943: when evaluation is False, skipping this task 41684 1727204478.80948: _execute() done 41684 1727204478.80951: dumping result to json 41684 1727204478.80954: done dumping result, returning 41684 1727204478.80959: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 [0affcd87-79f5-3839-086d-000000000644] 41684 1727204478.80968: sending task result for task 0affcd87-79f5-3839-086d-000000000644 41684 1727204478.81055: done sending task result for task 0affcd87-79f5-3839-086d-000000000644 41684 1727204478.81058: WORKER PROCESS EXITING skipping: [managed-node1] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 41684 1727204478.81119: no more pending results, returning what we have 41684 
1727204478.81122: results queue empty 41684 1727204478.81123: checking for any_errors_fatal 41684 1727204478.81127: done checking for any_errors_fatal 41684 1727204478.81128: checking for max_fail_percentage 41684 1727204478.81130: done checking for max_fail_percentage 41684 1727204478.81130: checking to see if all hosts have failed and the running result is not ok 41684 1727204478.81131: done checking to see if all hosts have failed 41684 1727204478.81132: getting the remaining hosts for this loop 41684 1727204478.81133: done getting the remaining hosts for this loop 41684 1727204478.81136: getting the next task for host managed-node1 41684 1727204478.81141: done getting next task for host managed-node1 41684 1727204478.81144: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 41684 1727204478.81148: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 41684 1727204478.81166: getting variables 41684 1727204478.81168: in VariableManager get_vars() 41684 1727204478.81207: Calling all_inventory to load vars for managed-node1 41684 1727204478.81209: Calling groups_inventory to load vars for managed-node1 41684 1727204478.81211: Calling all_plugins_inventory to load vars for managed-node1 41684 1727204478.81217: Calling all_plugins_play to load vars for managed-node1 41684 1727204478.81219: Calling groups_plugins_inventory to load vars for managed-node1 41684 1727204478.81220: Calling groups_plugins_play to load vars for managed-node1 41684 1727204478.82092: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41684 1727204478.82997: done with get_vars() 41684 1727204478.83013: done getting variables 41684 1727204478.83055: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25 Tuesday 24 September 2024 15:01:18 -0400 (0:00:00.031) 0:00:35.232 ***** 41684 1727204478.83080: entering _queue_task() for managed-node1/fail 41684 1727204478.83284: worker is 1 (out of 1 available) 41684 1727204478.83298: exiting _queue_task() for managed-node1/fail 41684 1727204478.83310: done queuing things up, now waiting for results queue to drain 41684 1727204478.83312: waiting for pending results... 
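The second `fail` guard (roles/network/tasks/main.yml:18) skipped for the same reason, `network_state != {}` evaluating False. A sketch consistent with the task name and the logged conditional; the version check is an assumption:

```yaml
# Hypothetical reconstruction of roles/network/tasks/main.yml:18.
- name: >-
    Abort applying the network state configuration if the system version
    of the managed host is below 8
  fail:
    msg: The network_state variable is not supported on EL7 or earlier  # assumed message
  when:
    - network_state != {}                            # confirmed by the log's false_condition
    - ansible_distribution_major_version | int < 8   # assumed: implied by the task name
```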
41684 1727204478.83488: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 41684 1727204478.83587: in run() - task 0affcd87-79f5-3839-086d-000000000645 41684 1727204478.83598: variable 'ansible_search_path' from source: unknown 41684 1727204478.83602: variable 'ansible_search_path' from source: unknown 41684 1727204478.83629: calling self._execute() 41684 1727204478.83703: variable 'ansible_host' from source: host vars for 'managed-node1' 41684 1727204478.83707: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41684 1727204478.83715: variable 'omit' from source: magic vars 41684 1727204478.83982: variable 'ansible_distribution_major_version' from source: facts 41684 1727204478.83993: Evaluated conditional (ansible_distribution_major_version != '6'): True 41684 1727204478.84118: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 41684 1727204478.85734: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 41684 1727204478.85793: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 41684 1727204478.85821: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 41684 1727204478.85849: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 41684 1727204478.85872: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 41684 1727204478.85928: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 41684 1727204478.85949: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 41684 1727204478.85972: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 41684 1727204478.85999: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 41684 1727204478.86009: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 41684 1727204478.86076: variable 'ansible_distribution_major_version' from source: facts 41684 1727204478.86089: Evaluated conditional (ansible_distribution_major_version | int > 9): False 41684 1727204478.86092: when evaluation is False, skipping this task 41684 1727204478.86094: _execute() done 41684 1727204478.86097: dumping result to json 41684 1727204478.86099: done dumping result, returning 41684 1727204478.86110: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later [0affcd87-79f5-3839-086d-000000000645] 41684 1727204478.86114: sending task result for task 0affcd87-79f5-3839-086d-000000000645 41684 1727204478.86204: done sending task result for task 0affcd87-79f5-3839-086d-000000000645 41684 1727204478.86206: WORKER PROCESS EXITING skipping: [managed-node1] => { "changed": false, "false_condition": "ansible_distribution_major_version | int > 9", "skip_reason": "Conditional result was False" } 41684 1727204478.86251: no more pending results, returning what we have 41684 1727204478.86255: 
results queue empty 41684 1727204478.86256: checking for any_errors_fatal 41684 1727204478.86266: done checking for any_errors_fatal 41684 1727204478.86267: checking for max_fail_percentage 41684 1727204478.86269: done checking for max_fail_percentage 41684 1727204478.86269: checking to see if all hosts have failed and the running result is not ok 41684 1727204478.86270: done checking to see if all hosts have failed 41684 1727204478.86271: getting the remaining hosts for this loop 41684 1727204478.86273: done getting the remaining hosts for this loop 41684 1727204478.86281: getting the next task for host managed-node1 41684 1727204478.86287: done getting next task for host managed-node1 41684 1727204478.86291: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 41684 1727204478.86295: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 41684 1727204478.86314: getting variables 41684 1727204478.86320: in VariableManager get_vars() 41684 1727204478.86358: Calling all_inventory to load vars for managed-node1 41684 1727204478.86361: Calling groups_inventory to load vars for managed-node1 41684 1727204478.86366: Calling all_plugins_inventory to load vars for managed-node1 41684 1727204478.86376: Calling all_plugins_play to load vars for managed-node1 41684 1727204478.86378: Calling groups_plugins_inventory to load vars for managed-node1 41684 1727204478.86381: Calling groups_plugins_play to load vars for managed-node1 41684 1727204478.87178: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41684 1727204478.88097: done with get_vars() 41684 1727204478.88114: done getting variables 41684 1727204478.88158: Loading ActionModule 'dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36 Tuesday 24 September 2024 15:01:18 -0400 (0:00:00.051) 0:00:35.283 ***** 41684 1727204478.88184: entering _queue_task() for managed-node1/dnf 41684 1727204478.88397: worker is 1 (out of 1 available) 41684 1727204478.88411: exiting _queue_task() for managed-node1/dnf 41684 1727204478.88425: done queuing things up, now waiting for results queue to drain 41684 1727204478.88426: waiting for pending results... 
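The teaming guard skipped above (roles/network/tasks/main.yml:25) logged `Evaluated conditional (ansible_distribution_major_version | int > 9): False`, which on this managed node (major version 9 or lower) skips the abort. A sketch with the confirmed conditional; the message and any check for team interfaces in `network_connections` are assumptions, as the log skipped before evaluating anything further:

```yaml
# Hypothetical reconstruction of roles/network/tasks/main.yml:25.
- name: >-
    Abort applying teaming configuration if the system version of the
    managed host is EL10 or later
  fail:
    msg: Team interfaces are not supported on EL10 or later  # assumed message
  when:
    - ansible_distribution_major_version | int > 9  # confirmed: evaluated False here
    # An additional conditional presumably tests whether network_connections
    # actually defines a team interface; it is not visible in this excerpt.
```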
41684 1727204478.88615: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 41684 1727204478.88703: in run() - task 0affcd87-79f5-3839-086d-000000000646 41684 1727204478.88715: variable 'ansible_search_path' from source: unknown 41684 1727204478.88718: variable 'ansible_search_path' from source: unknown 41684 1727204478.88746: calling self._execute() 41684 1727204478.88820: variable 'ansible_host' from source: host vars for 'managed-node1' 41684 1727204478.88824: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41684 1727204478.88832: variable 'omit' from source: magic vars 41684 1727204478.89106: variable 'ansible_distribution_major_version' from source: facts 41684 1727204478.89116: Evaluated conditional (ansible_distribution_major_version != '6'): True 41684 1727204478.89254: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 41684 1727204478.91037: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 41684 1727204478.91084: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 41684 1727204478.91121: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 41684 1727204478.91147: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 41684 1727204478.91171: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 41684 1727204478.91227: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 41684 1727204478.91247: Loading 
FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 41684 1727204478.91271: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 41684 1727204478.91299: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 41684 1727204478.91309: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 41684 1727204478.91390: variable 'ansible_distribution' from source: facts 41684 1727204478.91393: variable 'ansible_distribution_major_version' from source: facts 41684 1727204478.91406: Evaluated conditional (ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7): True 41684 1727204478.91483: variable '__network_wireless_connections_defined' from source: role '' defaults 41684 1727204478.91569: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 41684 1727204478.91586: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 41684 1727204478.91605: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 41684 1727204478.91633: Loading 
FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 41684 1727204478.91644: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 41684 1727204478.91675: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 41684 1727204478.91691: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 41684 1727204478.91712: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 41684 1727204478.91737: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 41684 1727204478.91747: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 41684 1727204478.91778: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 41684 1727204478.91793: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 41684 
1727204478.91811: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 41684 1727204478.91838: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 41684 1727204478.91849: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 41684 1727204478.91950: variable 'network_connections' from source: include params 41684 1727204478.91959: variable 'interface0' from source: play vars 41684 1727204478.92011: variable 'interface0' from source: play vars 41684 1727204478.92020: variable 'interface1' from source: play vars 41684 1727204478.92069: variable 'interface1' from source: play vars 41684 1727204478.92114: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 41684 1727204478.92224: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 41684 1727204478.92251: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 41684 1727204478.92276: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 41684 1727204478.92300: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 41684 1727204478.92330: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 41684 1727204478.92345: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 41684 1727204478.92368: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 41684 1727204478.92392: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 41684 1727204478.92428: variable '__network_team_connections_defined' from source: role '' defaults 41684 1727204478.92593: variable 'network_connections' from source: include params 41684 1727204478.92596: variable 'interface0' from source: play vars 41684 1727204478.92637: variable 'interface0' from source: play vars 41684 1727204478.92643: variable 'interface1' from source: play vars 41684 1727204478.92687: variable 'interface1' from source: play vars 41684 1727204478.92707: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 41684 1727204478.92711: when evaluation is False, skipping this task 41684 1727204478.92714: _execute() done 41684 1727204478.92716: dumping result to json 41684 1727204478.92718: done dumping result, returning 41684 1727204478.92725: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces [0affcd87-79f5-3839-086d-000000000646] 41684 1727204478.92730: sending task result for task 0affcd87-79f5-3839-086d-000000000646 41684 1727204478.92823: done sending task result for task 0affcd87-79f5-3839-086d-000000000646 41684 1727204478.92826: WORKER PROCESS EXITING skipping: [managed-node1] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", 
"skip_reason": "Conditional result was False" } 41684 1727204478.92878: no more pending results, returning what we have 41684 1727204478.92882: results queue empty 41684 1727204478.92883: checking for any_errors_fatal 41684 1727204478.92891: done checking for any_errors_fatal 41684 1727204478.92891: checking for max_fail_percentage 41684 1727204478.92893: done checking for max_fail_percentage 41684 1727204478.92894: checking to see if all hosts have failed and the running result is not ok 41684 1727204478.92894: done checking to see if all hosts have failed 41684 1727204478.92895: getting the remaining hosts for this loop 41684 1727204478.92897: done getting the remaining hosts for this loop 41684 1727204478.92901: getting the next task for host managed-node1 41684 1727204478.92908: done getting next task for host managed-node1 41684 1727204478.92911: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 41684 1727204478.92922: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 41684 1727204478.92940: getting variables 41684 1727204478.92942: in VariableManager get_vars() 41684 1727204478.92987: Calling all_inventory to load vars for managed-node1 41684 1727204478.92991: Calling groups_inventory to load vars for managed-node1 41684 1727204478.92993: Calling all_plugins_inventory to load vars for managed-node1 41684 1727204478.93001: Calling all_plugins_play to load vars for managed-node1 41684 1727204478.93004: Calling groups_plugins_inventory to load vars for managed-node1 41684 1727204478.93006: Calling groups_plugins_play to load vars for managed-node1 41684 1727204478.93910: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41684 1727204478.94932: done with get_vars() 41684 1727204478.94948: done getting variables redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 41684 1727204478.95004: Loading ActionModule 'ansible_collections.ansible.builtin.plugins.action.dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48 Tuesday 24 September 2024 15:01:18 -0400 (0:00:00.068) 0:00:35.351 ***** 41684 1727204478.95028: entering _queue_task() for managed-node1/yum 41684 1727204478.95254: worker is 1 (out of 1 available) 41684 1727204478.95271: exiting _queue_task() for managed-node1/yum 41684 1727204478.95283: done queuing things up, now waiting for results queue to drain 41684 1727204478.95284: waiting for pending results... 
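The skip above is driven by a `when:` guard: the log records `Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False`, so the DNF package check never runs. A minimal sketch of what such a guarded task looks like in the role's `tasks/main.yml` (the task name, path, and condition are confirmed by the log; the module arguments here are illustrative assumptions, not copied from the role):

```yaml
# Hypothetical sketch of the conditionally-skipped package check.
# Only the name and the when: condition are taken from the log above;
# the dnf arguments are illustrative.
- name: Check if updates for network packages are available through the
    DNF package manager due to wireless or team interfaces
  ansible.builtin.dnf:
    name: "{{ network_packages }}"  # role default, per the log's variable trace
    state: latest
  check_mode: true                  # report only; never install here
  when: __network_wireless_connections_defined or __network_team_connections_defined
```

With neither wireless nor team connections defined in `network_connections`, both defaults evaluate false and the task is reported as `skipping: [managed-node1]` with `"skip_reason": "Conditional result was False"`, exactly as in the JSON result above.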
41684 1727204478.95475: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 41684 1727204478.95646: in run() - task 0affcd87-79f5-3839-086d-000000000647 41684 1727204478.95668: variable 'ansible_search_path' from source: unknown 41684 1727204478.95677: variable 'ansible_search_path' from source: unknown 41684 1727204478.95714: calling self._execute() 41684 1727204478.95805: variable 'ansible_host' from source: host vars for 'managed-node1' 41684 1727204478.95816: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41684 1727204478.95828: variable 'omit' from source: magic vars 41684 1727204478.96187: variable 'ansible_distribution_major_version' from source: facts 41684 1727204478.96205: Evaluated conditional (ansible_distribution_major_version != '6'): True 41684 1727204478.96380: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 41684 1727204478.98497: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 41684 1727204478.98550: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 41684 1727204478.98582: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 41684 1727204478.98608: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 41684 1727204478.98628: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 41684 1727204478.98688: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 41684 1727204478.98706: Loading 
FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 41684 1727204478.98725: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 41684 1727204478.98755: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 41684 1727204478.98767: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 41684 1727204478.98835: variable 'ansible_distribution_major_version' from source: facts 41684 1727204478.98854: Evaluated conditional (ansible_distribution_major_version | int < 8): False 41684 1727204478.98857: when evaluation is False, skipping this task 41684 1727204478.98860: _execute() done 41684 1727204478.98866: dumping result to json 41684 1727204478.98869: done dumping result, returning 41684 1727204478.98871: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces [0affcd87-79f5-3839-086d-000000000647] 41684 1727204478.98874: sending task result for task 0affcd87-79f5-3839-086d-000000000647 41684 1727204478.98959: done sending task result for task 0affcd87-79f5-3839-086d-000000000647 41684 1727204478.98964: WORKER PROCESS EXITING skipping: [managed-node1] => { "changed": false, "false_condition": "ansible_distribution_major_version | int < 8", "skip_reason": "Conditional result was False" } 41684 1727204478.99019: no more pending results, returning 
what we have 41684 1727204478.99023: results queue empty 41684 1727204478.99024: checking for any_errors_fatal 41684 1727204478.99029: done checking for any_errors_fatal 41684 1727204478.99030: checking for max_fail_percentage 41684 1727204478.99032: done checking for max_fail_percentage 41684 1727204478.99033: checking to see if all hosts have failed and the running result is not ok 41684 1727204478.99033: done checking to see if all hosts have failed 41684 1727204478.99034: getting the remaining hosts for this loop 41684 1727204478.99036: done getting the remaining hosts for this loop 41684 1727204478.99040: getting the next task for host managed-node1 41684 1727204478.99048: done getting next task for host managed-node1 41684 1727204478.99052: ^ task is: TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 41684 1727204478.99056: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 41684 1727204478.99080: getting variables 41684 1727204478.99082: in VariableManager get_vars() 41684 1727204478.99121: Calling all_inventory to load vars for managed-node1 41684 1727204478.99123: Calling groups_inventory to load vars for managed-node1 41684 1727204478.99125: Calling all_plugins_inventory to load vars for managed-node1 41684 1727204478.99134: Calling all_plugins_play to load vars for managed-node1 41684 1727204478.99136: Calling groups_plugins_inventory to load vars for managed-node1 41684 1727204478.99139: Calling groups_plugins_play to load vars for managed-node1 41684 1727204479.00387: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41684 1727204479.07080: done with get_vars() 41684 1727204479.07105: done getting variables 41684 1727204479.07156: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60 Tuesday 24 September 2024 15:01:19 -0400 (0:00:00.121) 0:00:35.473 ***** 41684 1727204479.07190: entering _queue_task() for managed-node1/fail 41684 1727204479.07535: worker is 1 (out of 1 available) 41684 1727204479.07548: exiting _queue_task() for managed-node1/fail 41684 1727204479.07565: done queuing things up, now waiting for results queue to drain 41684 1727204479.07568: waiting for pending results... 
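The YUM variant of the check is skipped for a different reason: on this host the guard `ansible_distribution_major_version | int < 8` evaluates false, so only the DNF path could ever run. Note also the line `redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf` — on modern targets the `yum` action is served by the `dnf` action plugin. A sketch of the complementary version guards, assuming task bodies that are not shown in the log:

```yaml
# Hypothetical sketch: complementary version guards ensure at most one of
# the two package-manager checks runs on a given host. Conditions are
# taken from the log; module arguments are illustrative.
- name: Check if updates for network packages are available through the
    YUM package manager due to wireless or team interfaces
  ansible.builtin.yum:             # redirected to the dnf action plugin on EL8+
    name: "{{ network_packages }}"
    state: latest
  check_mode: true
  when: ansible_distribution_major_version | int < 8
```

On this managed node the major version is 8 or later, so the conditional is false and the task is skipped without ever contacting the package manager.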
41684 1727204479.07872: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 41684 1727204479.08020: in run() - task 0affcd87-79f5-3839-086d-000000000648 41684 1727204479.08041: variable 'ansible_search_path' from source: unknown 41684 1727204479.08049: variable 'ansible_search_path' from source: unknown 41684 1727204479.08096: calling self._execute() 41684 1727204479.08203: variable 'ansible_host' from source: host vars for 'managed-node1' 41684 1727204479.08215: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41684 1727204479.08236: variable 'omit' from source: magic vars 41684 1727204479.08641: variable 'ansible_distribution_major_version' from source: facts 41684 1727204479.08677: Evaluated conditional (ansible_distribution_major_version != '6'): True 41684 1727204479.08758: variable '__network_wireless_connections_defined' from source: role '' defaults 41684 1727204479.08909: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 41684 1727204479.10651: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 41684 1727204479.10685: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 41684 1727204479.10720: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 41684 1727204479.10753: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 41684 1727204479.10782: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 41684 1727204479.10858: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 
(found_in_cache=True, class_only=False) 41684 1727204479.10897: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 41684 1727204479.10920: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 41684 1727204479.10960: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 41684 1727204479.10978: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 41684 1727204479.11020: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 41684 1727204479.11042: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 41684 1727204479.11071: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 41684 1727204479.11110: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 41684 1727204479.11124: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 41684 1727204479.11162: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 41684 1727204479.11189: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 41684 1727204479.11212: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 41684 1727204479.11250: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 41684 1727204479.11263: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 41684 1727204479.11446: variable 'network_connections' from source: include params 41684 1727204479.11449: variable 'interface0' from source: play vars 41684 1727204479.11522: variable 'interface0' from source: play vars 41684 1727204479.11538: variable 'interface1' from source: play vars 41684 1727204479.11599: variable 'interface1' from source: play vars 41684 1727204479.11665: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 41684 1727204479.11836: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 41684 1727204479.11875: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 41684 1727204479.11920: 
Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 41684 1727204479.11936: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 41684 1727204479.11969: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 41684 1727204479.11985: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 41684 1727204479.12010: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 41684 1727204479.12031: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 41684 1727204479.12073: variable '__network_team_connections_defined' from source: role '' defaults 41684 1727204479.12228: variable 'network_connections' from source: include params 41684 1727204479.12231: variable 'interface0' from source: play vars 41684 1727204479.12279: variable 'interface0' from source: play vars 41684 1727204479.12286: variable 'interface1' from source: play vars 41684 1727204479.12330: variable 'interface1' from source: play vars 41684 1727204479.12348: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 41684 1727204479.12352: when evaluation is False, skipping this task 41684 1727204479.12354: _execute() done 41684 1727204479.12357: dumping result to json 41684 1727204479.12359: done dumping result, returning 41684 1727204479.12370: done running TaskExecutor() for managed-node1/TASK: 
fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces [0affcd87-79f5-3839-086d-000000000648] 41684 1727204479.12381: sending task result for task 0affcd87-79f5-3839-086d-000000000648 41684 1727204479.12462: done sending task result for task 0affcd87-79f5-3839-086d-000000000648 41684 1727204479.12466: WORKER PROCESS EXITING skipping: [managed-node1] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 41684 1727204479.12512: no more pending results, returning what we have 41684 1727204479.12516: results queue empty 41684 1727204479.12517: checking for any_errors_fatal 41684 1727204479.12524: done checking for any_errors_fatal 41684 1727204479.12525: checking for max_fail_percentage 41684 1727204479.12527: done checking for max_fail_percentage 41684 1727204479.12527: checking to see if all hosts have failed and the running result is not ok 41684 1727204479.12528: done checking to see if all hosts have failed 41684 1727204479.12529: getting the remaining hosts for this loop 41684 1727204479.12530: done getting the remaining hosts for this loop 41684 1727204479.12534: getting the next task for host managed-node1 41684 1727204479.12542: done getting next task for host managed-node1 41684 1727204479.12545: ^ task is: TASK: fedora.linux_system_roles.network : Install packages 41684 1727204479.12549: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 41684 1727204479.12575: getting variables 41684 1727204479.12577: in VariableManager get_vars() 41684 1727204479.12621: Calling all_inventory to load vars for managed-node1 41684 1727204479.12624: Calling groups_inventory to load vars for managed-node1 41684 1727204479.12626: Calling all_plugins_inventory to load vars for managed-node1 41684 1727204479.12634: Calling all_plugins_play to load vars for managed-node1 41684 1727204479.12637: Calling groups_plugins_inventory to load vars for managed-node1 41684 1727204479.12639: Calling groups_plugins_play to load vars for managed-node1 41684 1727204479.13482: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41684 1727204479.14936: done with get_vars() 41684 1727204479.14965: done getting variables 41684 1727204479.15027: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install packages] ******************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73 Tuesday 24 September 2024 15:01:19 -0400 (0:00:00.078) 0:00:35.552 ***** 41684 1727204479.15068: entering _queue_task() for managed-node1/package 41684 1727204479.15404: worker is 1 (out of 1 available) 41684 1727204479.15419: 
exiting _queue_task() for managed-node1/package 41684 1727204479.15433: done queuing things up, now waiting for results queue to drain 41684 1727204479.15434: waiting for pending results... 41684 1727204479.15743: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Install packages 41684 1727204479.15885: in run() - task 0affcd87-79f5-3839-086d-000000000649 41684 1727204479.15895: variable 'ansible_search_path' from source: unknown 41684 1727204479.15899: variable 'ansible_search_path' from source: unknown 41684 1727204479.15937: calling self._execute() 41684 1727204479.16036: variable 'ansible_host' from source: host vars for 'managed-node1' 41684 1727204479.16041: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41684 1727204479.16062: variable 'omit' from source: magic vars 41684 1727204479.16388: variable 'ansible_distribution_major_version' from source: facts 41684 1727204479.16398: Evaluated conditional (ansible_distribution_major_version != '6'): True 41684 1727204479.16538: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 41684 1727204479.16736: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 41684 1727204479.16774: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 41684 1727204479.16800: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 41684 1727204479.16855: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 41684 1727204479.16936: variable 'network_packages' from source: role '' defaults 41684 1727204479.17014: variable '__network_provider_setup' from source: role '' defaults 41684 1727204479.17022: variable '__network_service_name_default_nm' from source: role '' defaults 41684 1727204479.17076: variable 
'__network_service_name_default_nm' from source: role '' defaults 41684 1727204479.17082: variable '__network_packages_default_nm' from source: role '' defaults 41684 1727204479.17128: variable '__network_packages_default_nm' from source: role '' defaults 41684 1727204479.17248: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 41684 1727204479.19335: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 41684 1727204479.19397: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 41684 1727204479.19436: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 41684 1727204479.19474: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 41684 1727204479.19500: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 41684 1727204479.19923: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 41684 1727204479.19956: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 41684 1727204479.19982: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 41684 1727204479.20021: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 41684 1727204479.20035: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 41684 1727204479.20083: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 41684 1727204479.20112: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 41684 1727204479.20139: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 41684 1727204479.20185: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 41684 1727204479.20189: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 41684 1727204479.20393: variable '__network_packages_default_gobject_packages' from source: role '' defaults 41684 1727204479.20659: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 41684 1727204479.20662: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 41684 1727204479.20671: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 41684 1727204479.20673: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 41684 1727204479.20677: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 41684 1727204479.20684: variable 'ansible_python' from source: facts 41684 1727204479.20710: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 41684 1727204479.20796: variable '__network_wpa_supplicant_required' from source: role '' defaults 41684 1727204479.20882: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 41684 1727204479.21017: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 41684 1727204479.21039: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 41684 1727204479.21063: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 41684 1727204479.21105: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 41684 1727204479.21119: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 41684 1727204479.21161: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 41684 1727204479.21190: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 41684 1727204479.21213: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 41684 1727204479.21251: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 41684 1727204479.21272: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 41684 1727204479.21425: variable 'network_connections' from source: include params 41684 1727204479.21429: variable 'interface0' from source: play vars 41684 1727204479.21528: variable 'interface0' from source: play vars 41684 1727204479.21540: variable 'interface1' from source: play vars 41684 1727204479.21645: variable 'interface1' from source: play vars 41684 1727204479.21718: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 41684 1727204479.21752: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 
(found_in_cache=True, class_only=False) 41684 1727204479.21782: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 41684 1727204479.21811: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 41684 1727204479.21861: variable '__network_wireless_connections_defined' from source: role '' defaults 41684 1727204479.22111: variable 'network_connections' from source: include params 41684 1727204479.22115: variable 'interface0' from source: play vars 41684 1727204479.22194: variable 'interface0' from source: play vars 41684 1727204479.22201: variable 'interface1' from source: play vars 41684 1727204479.22278: variable 'interface1' from source: play vars 41684 1727204479.22302: variable '__network_packages_default_wireless' from source: role '' defaults 41684 1727204479.22359: variable '__network_wireless_connections_defined' from source: role '' defaults 41684 1727204479.22569: variable 'network_connections' from source: include params 41684 1727204479.22575: variable 'interface0' from source: play vars 41684 1727204479.22620: variable 'interface0' from source: play vars 41684 1727204479.22626: variable 'interface1' from source: play vars 41684 1727204479.22676: variable 'interface1' from source: play vars 41684 1727204479.22694: variable '__network_packages_default_team' from source: role '' defaults 41684 1727204479.22747: variable '__network_team_connections_defined' from source: role '' defaults 41684 1727204479.22956: variable 'network_connections' from source: include params 41684 1727204479.22959: variable 'interface0' from source: play vars 41684 1727204479.23010: variable 'interface0' from source: play vars 41684 1727204479.23017: variable 'interface1' from source: play vars 41684 
1727204479.23061: variable 'interface1' from source: play vars 41684 1727204479.23104: variable '__network_service_name_default_initscripts' from source: role '' defaults 41684 1727204479.23146: variable '__network_service_name_default_initscripts' from source: role '' defaults 41684 1727204479.23151: variable '__network_packages_default_initscripts' from source: role '' defaults 41684 1727204479.23198: variable '__network_packages_default_initscripts' from source: role '' defaults 41684 1727204479.23338: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 41684 1727204479.23970: variable 'network_connections' from source: include params 41684 1727204479.23974: variable 'interface0' from source: play vars 41684 1727204479.23995: variable 'interface0' from source: play vars 41684 1727204479.24000: variable 'interface1' from source: play vars 41684 1727204479.24060: variable 'interface1' from source: play vars 41684 1727204479.24143: variable 'ansible_distribution' from source: facts 41684 1727204479.24147: variable '__network_rh_distros' from source: role '' defaults 41684 1727204479.24156: variable 'ansible_distribution_major_version' from source: facts 41684 1727204479.24176: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 41684 1727204479.24342: variable 'ansible_distribution' from source: facts 41684 1727204479.24345: variable '__network_rh_distros' from source: role '' defaults 41684 1727204479.24352: variable 'ansible_distribution_major_version' from source: facts 41684 1727204479.24369: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 41684 1727204479.24532: variable 'ansible_distribution' from source: facts 41684 1727204479.24535: variable '__network_rh_distros' from source: role '' defaults 41684 1727204479.24541: variable 'ansible_distribution_major_version' from source: facts 41684 1727204479.24583: variable 'network_provider' from 
source: set_fact 41684 1727204479.25184: variable 'ansible_facts' from source: unknown 41684 1727204479.26390: Evaluated conditional (not network_packages is subset(ansible_facts.packages.keys())): False 41684 1727204479.26394: when evaluation is False, skipping this task 41684 1727204479.26397: _execute() done 41684 1727204479.26399: dumping result to json 41684 1727204479.26402: done dumping result, returning 41684 1727204479.26413: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Install packages [0affcd87-79f5-3839-086d-000000000649] 41684 1727204479.26416: sending task result for task 0affcd87-79f5-3839-086d-000000000649 41684 1727204479.26522: done sending task result for task 0affcd87-79f5-3839-086d-000000000649 41684 1727204479.26526: WORKER PROCESS EXITING skipping: [managed-node1] => { "changed": false, "false_condition": "not network_packages is subset(ansible_facts.packages.keys())", "skip_reason": "Conditional result was False" } 41684 1727204479.26580: no more pending results, returning what we have 41684 1727204479.26584: results queue empty 41684 1727204479.26584: checking for any_errors_fatal 41684 1727204479.26590: done checking for any_errors_fatal 41684 1727204479.26591: checking for max_fail_percentage 41684 1727204479.26592: done checking for max_fail_percentage 41684 1727204479.26593: checking to see if all hosts have failed and the running result is not ok 41684 1727204479.26594: done checking to see if all hosts have failed 41684 1727204479.26594: getting the remaining hosts for this loop 41684 1727204479.26596: done getting the remaining hosts for this loop 41684 1727204479.26600: getting the next task for host managed-node1 41684 1727204479.26608: done getting next task for host managed-node1 41684 1727204479.26611: ^ task is: TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 41684 1727204479.26615: ^ state is: HOST STATE: block=3, task=11, 
rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 41684 1727204479.26640: getting variables 41684 1727204479.26642: in VariableManager get_vars() 41684 1727204479.26686: Calling all_inventory to load vars for managed-node1 41684 1727204479.26689: Calling groups_inventory to load vars for managed-node1 41684 1727204479.26691: Calling all_plugins_inventory to load vars for managed-node1 41684 1727204479.26700: Calling all_plugins_play to load vars for managed-node1 41684 1727204479.26702: Calling groups_plugins_inventory to load vars for managed-node1 41684 1727204479.26704: Calling groups_plugins_play to load vars for managed-node1 41684 1727204479.27839: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41684 1727204479.28768: done with get_vars() 41684 1727204479.28785: done getting variables 41684 1727204479.28828: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK 
[fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85 Tuesday 24 September 2024 15:01:19 -0400 (0:00:00.137) 0:00:35.690 ***** 41684 1727204479.28854: entering _queue_task() for managed-node1/package 41684 1727204479.29087: worker is 1 (out of 1 available) 41684 1727204479.29101: exiting _queue_task() for managed-node1/package 41684 1727204479.29115: done queuing things up, now waiting for results queue to drain 41684 1727204479.29116: waiting for pending results... 41684 1727204479.29305: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 41684 1727204479.29398: in run() - task 0affcd87-79f5-3839-086d-00000000064a 41684 1727204479.29409: variable 'ansible_search_path' from source: unknown 41684 1727204479.29412: variable 'ansible_search_path' from source: unknown 41684 1727204479.29442: calling self._execute() 41684 1727204479.29522: variable 'ansible_host' from source: host vars for 'managed-node1' 41684 1727204479.29527: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41684 1727204479.29534: variable 'omit' from source: magic vars 41684 1727204479.29817: variable 'ansible_distribution_major_version' from source: facts 41684 1727204479.29834: Evaluated conditional (ansible_distribution_major_version != '6'): True 41684 1727204479.29921: variable 'network_state' from source: role '' defaults 41684 1727204479.29929: Evaluated conditional (network_state != {}): False 41684 1727204479.29933: when evaluation is False, skipping this task 41684 1727204479.29936: _execute() done 41684 1727204479.29939: dumping result to json 41684 1727204479.29942: done dumping result, returning 41684 1727204479.29949: done running TaskExecutor() for managed-node1/TASK: 
fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable [0affcd87-79f5-3839-086d-00000000064a] 41684 1727204479.29955: sending task result for task 0affcd87-79f5-3839-086d-00000000064a 41684 1727204479.30047: done sending task result for task 0affcd87-79f5-3839-086d-00000000064a 41684 1727204479.30051: WORKER PROCESS EXITING skipping: [managed-node1] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 41684 1727204479.30105: no more pending results, returning what we have 41684 1727204479.30108: results queue empty 41684 1727204479.30109: checking for any_errors_fatal 41684 1727204479.30116: done checking for any_errors_fatal 41684 1727204479.30117: checking for max_fail_percentage 41684 1727204479.30118: done checking for max_fail_percentage 41684 1727204479.30119: checking to see if all hosts have failed and the running result is not ok 41684 1727204479.30120: done checking to see if all hosts have failed 41684 1727204479.30120: getting the remaining hosts for this loop 41684 1727204479.30122: done getting the remaining hosts for this loop 41684 1727204479.30126: getting the next task for host managed-node1 41684 1727204479.30132: done getting next task for host managed-node1 41684 1727204479.30136: ^ task is: TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 41684 1727204479.30140: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 41684 1727204479.30158: getting variables 41684 1727204479.30160: in VariableManager get_vars() 41684 1727204479.30201: Calling all_inventory to load vars for managed-node1 41684 1727204479.30204: Calling groups_inventory to load vars for managed-node1 41684 1727204479.30206: Calling all_plugins_inventory to load vars for managed-node1 41684 1727204479.30215: Calling all_plugins_play to load vars for managed-node1 41684 1727204479.30218: Calling groups_plugins_inventory to load vars for managed-node1 41684 1727204479.30220: Calling groups_plugins_play to load vars for managed-node1 41684 1727204479.31003: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41684 1727204479.32239: done with get_vars() 41684 1727204479.32265: done getting variables 41684 1727204479.32323: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96 Tuesday 24 September 2024 15:01:19 -0400 (0:00:00.034) 0:00:35.725 ***** 41684 1727204479.32355: entering _queue_task() for managed-node1/package 41684 1727204479.32659: worker is 1 (out of 1 available) 
41684 1727204479.32675: exiting _queue_task() for managed-node1/package 41684 1727204479.32689: done queuing things up, now waiting for results queue to drain 41684 1727204479.32690: waiting for pending results... 41684 1727204479.32999: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 41684 1727204479.33100: in run() - task 0affcd87-79f5-3839-086d-00000000064b 41684 1727204479.33111: variable 'ansible_search_path' from source: unknown 41684 1727204479.33115: variable 'ansible_search_path' from source: unknown 41684 1727204479.33146: calling self._execute() 41684 1727204479.33223: variable 'ansible_host' from source: host vars for 'managed-node1' 41684 1727204479.33228: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41684 1727204479.33234: variable 'omit' from source: magic vars 41684 1727204479.33522: variable 'ansible_distribution_major_version' from source: facts 41684 1727204479.33533: Evaluated conditional (ansible_distribution_major_version != '6'): True 41684 1727204479.33618: variable 'network_state' from source: role '' defaults 41684 1727204479.33626: Evaluated conditional (network_state != {}): False 41684 1727204479.33629: when evaluation is False, skipping this task 41684 1727204479.33632: _execute() done 41684 1727204479.33634: dumping result to json 41684 1727204479.33637: done dumping result, returning 41684 1727204479.33644: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable [0affcd87-79f5-3839-086d-00000000064b] 41684 1727204479.33650: sending task result for task 0affcd87-79f5-3839-086d-00000000064b 41684 1727204479.33743: done sending task result for task 0affcd87-79f5-3839-086d-00000000064b 41684 1727204479.33746: WORKER PROCESS EXITING skipping: [managed-node1] => { "changed": false, "false_condition": "network_state != {}", 
"skip_reason": "Conditional result was False" } 41684 1727204479.33828: no more pending results, returning what we have 41684 1727204479.33831: results queue empty 41684 1727204479.33832: checking for any_errors_fatal 41684 1727204479.33837: done checking for any_errors_fatal 41684 1727204479.33838: checking for max_fail_percentage 41684 1727204479.33839: done checking for max_fail_percentage 41684 1727204479.33840: checking to see if all hosts have failed and the running result is not ok 41684 1727204479.33840: done checking to see if all hosts have failed 41684 1727204479.33841: getting the remaining hosts for this loop 41684 1727204479.33842: done getting the remaining hosts for this loop 41684 1727204479.33845: getting the next task for host managed-node1 41684 1727204479.33850: done getting next task for host managed-node1 41684 1727204479.33854: ^ task is: TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 41684 1727204479.33857: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 41684 1727204479.33878: getting variables 41684 1727204479.33880: in VariableManager get_vars() 41684 1727204479.33917: Calling all_inventory to load vars for managed-node1 41684 1727204479.33919: Calling groups_inventory to load vars for managed-node1 41684 1727204479.33920: Calling all_plugins_inventory to load vars for managed-node1 41684 1727204479.33927: Calling all_plugins_play to load vars for managed-node1 41684 1727204479.33928: Calling groups_plugins_inventory to load vars for managed-node1 41684 1727204479.33930: Calling groups_plugins_play to load vars for managed-node1 41684 1727204479.34922: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41684 1727204479.36555: done with get_vars() 41684 1727204479.36583: done getting variables 41684 1727204479.36643: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109 Tuesday 24 September 2024 15:01:19 -0400 (0:00:00.043) 0:00:35.768 ***** 41684 1727204479.36682: entering _queue_task() for managed-node1/service 41684 1727204479.36997: worker is 1 (out of 1 available) 41684 1727204479.37009: exiting _queue_task() for managed-node1/service 41684 1727204479.37023: done queuing things up, now waiting for results queue to drain 41684 1727204479.37024: waiting for pending results... 
41684 1727204479.37316: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 41684 1727204479.37456: in run() - task 0affcd87-79f5-3839-086d-00000000064c 41684 1727204479.37484: variable 'ansible_search_path' from source: unknown 41684 1727204479.37492: variable 'ansible_search_path' from source: unknown 41684 1727204479.37529: calling self._execute() 41684 1727204479.37639: variable 'ansible_host' from source: host vars for 'managed-node1' 41684 1727204479.37652: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41684 1727204479.37669: variable 'omit' from source: magic vars 41684 1727204479.38051: variable 'ansible_distribution_major_version' from source: facts 41684 1727204479.38073: Evaluated conditional (ansible_distribution_major_version != '6'): True 41684 1727204479.38207: variable '__network_wireless_connections_defined' from source: role '' defaults 41684 1727204479.38421: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 41684 1727204479.40839: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 41684 1727204479.40927: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 41684 1727204479.40976: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 41684 1727204479.41016: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 41684 1727204479.41047: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 41684 1727204479.41134: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, 
class_only=False) 41684 1727204479.41172: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 41684 1727204479.41208: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 41684 1727204479.41257: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 41684 1727204479.41285: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 41684 1727204479.41336: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 41684 1727204479.41369: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 41684 1727204479.41404: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 41684 1727204479.41449: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 41684 1727204479.41474: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 41684 1727204479.41525: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 41684 1727204479.41554: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 41684 1727204479.41589: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 41684 1727204479.41637: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 41684 1727204479.41659: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 41684 1727204479.41854: variable 'network_connections' from source: include params 41684 1727204479.41875: variable 'interface0' from source: play vars 41684 1727204479.41953: variable 'interface0' from source: play vars 41684 1727204479.41974: variable 'interface1' from source: play vars 41684 1727204479.42037: variable 'interface1' from source: play vars 41684 1727204479.42122: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 41684 1727204479.42310: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 41684 1727204479.42349: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 41684 1727204479.42391: 
Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 41684 1727204479.42425: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 41684 1727204479.42476: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 41684 1727204479.42508: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 41684 1727204479.42538: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 41684 1727204479.42574: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 41684 1727204479.42634: variable '__network_team_connections_defined' from source: role '' defaults 41684 1727204479.42888: variable 'network_connections' from source: include params 41684 1727204479.42899: variable 'interface0' from source: play vars 41684 1727204479.42973: variable 'interface0' from source: play vars 41684 1727204479.42986: variable 'interface1' from source: play vars 41684 1727204479.43050: variable 'interface1' from source: play vars 41684 1727204479.43083: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 41684 1727204479.43090: when evaluation is False, skipping this task 41684 1727204479.43097: _execute() done 41684 1727204479.43103: dumping result to json 41684 1727204479.43109: done dumping result, returning 41684 1727204479.43119: done running TaskExecutor() for managed-node1/TASK: 
fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces [0affcd87-79f5-3839-086d-00000000064c] 41684 1727204479.43140: sending task result for task 0affcd87-79f5-3839-086d-00000000064c skipping: [managed-node1] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 41684 1727204479.43302: no more pending results, returning what we have 41684 1727204479.43306: results queue empty 41684 1727204479.43308: checking for any_errors_fatal 41684 1727204479.43315: done checking for any_errors_fatal 41684 1727204479.43316: checking for max_fail_percentage 41684 1727204479.43318: done checking for max_fail_percentage 41684 1727204479.43319: checking to see if all hosts have failed and the running result is not ok 41684 1727204479.43320: done checking to see if all hosts have failed 41684 1727204479.43320: getting the remaining hosts for this loop 41684 1727204479.43322: done getting the remaining hosts for this loop 41684 1727204479.43327: getting the next task for host managed-node1 41684 1727204479.43335: done getting next task for host managed-node1 41684 1727204479.43339: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 41684 1727204479.43343: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 41684 1727204479.43369: getting variables 41684 1727204479.43372: in VariableManager get_vars() 41684 1727204479.43421: Calling all_inventory to load vars for managed-node1 41684 1727204479.43424: Calling groups_inventory to load vars for managed-node1 41684 1727204479.43427: Calling all_plugins_inventory to load vars for managed-node1 41684 1727204479.43438: Calling all_plugins_play to load vars for managed-node1 41684 1727204479.43441: Calling groups_plugins_inventory to load vars for managed-node1 41684 1727204479.43445: Calling groups_plugins_play to load vars for managed-node1 41684 1727204479.44589: done sending task result for task 0affcd87-79f5-3839-086d-00000000064c 41684 1727204479.44594: WORKER PROCESS EXITING 41684 1727204479.44605: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41684 1727204479.45607: done with get_vars() 41684 1727204479.45629: done getting variables 41684 1727204479.45694: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start NetworkManager] ***** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122 Tuesday 24 September 2024 15:01:19 -0400 (0:00:00.090) 0:00:35.858 ***** 41684 1727204479.45726: entering _queue_task() for managed-node1/service 41684 1727204479.46038: worker is 1 (out of 1 available) 41684 1727204479.46050: exiting _queue_task() for managed-node1/service 41684 
1727204479.46067: done queuing things up, now waiting for results queue to drain 41684 1727204479.46069: waiting for pending results... 41684 1727204479.46359: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 41684 1727204479.46501: in run() - task 0affcd87-79f5-3839-086d-00000000064d 41684 1727204479.46525: variable 'ansible_search_path' from source: unknown 41684 1727204479.46534: variable 'ansible_search_path' from source: unknown 41684 1727204479.46581: calling self._execute() 41684 1727204479.46698: variable 'ansible_host' from source: host vars for 'managed-node1' 41684 1727204479.46709: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41684 1727204479.46728: variable 'omit' from source: magic vars 41684 1727204479.47071: variable 'ansible_distribution_major_version' from source: facts 41684 1727204479.47081: Evaluated conditional (ansible_distribution_major_version != '6'): True 41684 1727204479.47195: variable 'network_provider' from source: set_fact 41684 1727204479.47199: variable 'network_state' from source: role '' defaults 41684 1727204479.47209: Evaluated conditional (network_provider == "nm" or network_state != {}): True 41684 1727204479.47214: variable 'omit' from source: magic vars 41684 1727204479.47258: variable 'omit' from source: magic vars 41684 1727204479.47280: variable 'network_service_name' from source: role '' defaults 41684 1727204479.47329: variable 'network_service_name' from source: role '' defaults 41684 1727204479.47413: variable '__network_provider_setup' from source: role '' defaults 41684 1727204479.47416: variable '__network_service_name_default_nm' from source: role '' defaults 41684 1727204479.47465: variable '__network_service_name_default_nm' from source: role '' defaults 41684 1727204479.47474: variable '__network_packages_default_nm' from source: role '' defaults 41684 1727204479.47518: variable '__network_packages_default_nm' 
from source: role '' defaults 41684 1727204479.47675: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 41684 1727204479.49835: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 41684 1727204479.49925: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 41684 1727204479.49970: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 41684 1727204479.50012: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 41684 1727204479.50042: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 41684 1727204479.50133: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 41684 1727204479.50170: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 41684 1727204479.50203: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 41684 1727204479.50250: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 41684 1727204479.50273: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 41684 1727204479.50327: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 41684 1727204479.50356: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 41684 1727204479.50388: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 41684 1727204479.50438: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 41684 1727204479.50460: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 41684 1727204479.50709: variable '__network_packages_default_gobject_packages' from source: role '' defaults 41684 1727204479.50842: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 41684 1727204479.50875: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 41684 1727204479.50889: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 41684 1727204479.50914: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 41684 1727204479.50924: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 41684 1727204479.51004: variable 'ansible_python' from source: facts 41684 1727204479.51022: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 41684 1727204479.51084: variable '__network_wpa_supplicant_required' from source: role '' defaults 41684 1727204479.51139: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 41684 1727204479.51226: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 41684 1727204479.51244: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 41684 1727204479.51261: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 41684 1727204479.51293: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 41684 1727204479.51303: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 41684 1727204479.51337: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 41684 1727204479.51356: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 41684 1727204479.51377: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 41684 1727204479.51405: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 41684 1727204479.51416: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 41684 1727204479.51510: variable 'network_connections' from source: include params 41684 1727204479.51519: variable 'interface0' from source: play vars 41684 1727204479.51573: variable 'interface0' from source: play vars 41684 1727204479.51584: variable 'interface1' from source: play vars 41684 1727204479.51638: variable 'interface1' from source: play vars 41684 1727204479.51717: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 41684 1727204479.51852: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 41684 1727204479.51891: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 41684 1727204479.51925: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 41684 1727204479.51956: Loading TestModule 'uri' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 41684 1727204479.52003: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 41684 1727204479.52024: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 41684 1727204479.52051: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 41684 1727204479.52079: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 41684 1727204479.52118: variable '__network_wireless_connections_defined' from source: role '' defaults 41684 1727204479.52300: variable 'network_connections' from source: include params 41684 1727204479.52306: variable 'interface0' from source: play vars 41684 1727204479.52359: variable 'interface0' from source: play vars 41684 1727204479.52371: variable 'interface1' from source: play vars 41684 1727204479.52428: variable 'interface1' from source: play vars 41684 1727204479.52453: variable '__network_packages_default_wireless' from source: role '' defaults 41684 1727204479.52544: variable '__network_wireless_connections_defined' from source: role '' defaults 41684 1727204479.53227: variable 'network_connections' from source: include params 41684 1727204479.53230: variable 'interface0' from source: play vars 41684 1727204479.53233: variable 'interface0' from source: play vars 41684 1727204479.53235: variable 'interface1' from source: play vars 41684 1727204479.53237: variable 'interface1' from source: play vars 41684 1727204479.53239: variable 
'__network_packages_default_team' from source: role '' defaults 41684 1727204479.53242: variable '__network_team_connections_defined' from source: role '' defaults 41684 1727204479.53567: variable 'network_connections' from source: include params 41684 1727204479.53571: variable 'interface0' from source: play vars 41684 1727204479.53620: variable 'interface0' from source: play vars 41684 1727204479.53626: variable 'interface1' from source: play vars 41684 1727204479.53699: variable 'interface1' from source: play vars 41684 1727204479.53781: variable '__network_service_name_default_initscripts' from source: role '' defaults 41684 1727204479.53814: variable '__network_service_name_default_initscripts' from source: role '' defaults 41684 1727204479.53818: variable '__network_packages_default_initscripts' from source: role '' defaults 41684 1727204479.54059: variable '__network_packages_default_initscripts' from source: role '' defaults 41684 1727204479.54092: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 41684 1727204479.54604: variable 'network_connections' from source: include params 41684 1727204479.54607: variable 'interface0' from source: play vars 41684 1727204479.54675: variable 'interface0' from source: play vars 41684 1727204479.54682: variable 'interface1' from source: play vars 41684 1727204479.54742: variable 'interface1' from source: play vars 41684 1727204479.54760: variable 'ansible_distribution' from source: facts 41684 1727204479.54767: variable '__network_rh_distros' from source: role '' defaults 41684 1727204479.54770: variable 'ansible_distribution_major_version' from source: facts 41684 1727204479.54794: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 41684 1727204479.54953: variable 'ansible_distribution' from source: facts 41684 1727204479.54957: variable '__network_rh_distros' from source: role '' defaults 41684 1727204479.54970: variable 
'ansible_distribution_major_version' from source: facts 41684 1727204479.54978: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 41684 1727204479.55156: variable 'ansible_distribution' from source: facts 41684 1727204479.55159: variable '__network_rh_distros' from source: role '' defaults 41684 1727204479.55165: variable 'ansible_distribution_major_version' from source: facts 41684 1727204479.55207: variable 'network_provider' from source: set_fact 41684 1727204479.55239: variable 'omit' from source: magic vars 41684 1727204479.55269: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 41684 1727204479.55288: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 41684 1727204479.55303: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 41684 1727204479.55319: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41684 1727204479.55328: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41684 1727204479.55357: variable 'inventory_hostname' from source: host vars for 'managed-node1' 41684 1727204479.55360: variable 'ansible_host' from source: host vars for 'managed-node1' 41684 1727204479.55367: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41684 1727204479.55434: Set connection var ansible_connection to ssh 41684 1727204479.55438: Set connection var ansible_pipelining to False 41684 1727204479.55444: Set connection var ansible_module_compression to ZIP_DEFLATED 41684 1727204479.55449: Set connection var ansible_timeout to 10 41684 1727204479.55455: Set connection var ansible_shell_executable to /bin/sh 41684 1727204479.55459: Set connection var ansible_shell_type to sh 41684 1727204479.55483: 
variable 'ansible_shell_executable' from source: unknown 41684 1727204479.55486: variable 'ansible_connection' from source: unknown 41684 1727204479.55489: variable 'ansible_module_compression' from source: unknown 41684 1727204479.55491: variable 'ansible_shell_type' from source: unknown 41684 1727204479.55493: variable 'ansible_shell_executable' from source: unknown 41684 1727204479.55499: variable 'ansible_host' from source: host vars for 'managed-node1' 41684 1727204479.55502: variable 'ansible_pipelining' from source: unknown 41684 1727204479.55504: variable 'ansible_timeout' from source: unknown 41684 1727204479.55506: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41684 1727204479.55584: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 41684 1727204479.55587: variable 'omit' from source: magic vars 41684 1727204479.55593: starting attempt loop 41684 1727204479.55596: running the handler 41684 1727204479.55651: variable 'ansible_facts' from source: unknown 41684 1727204479.56088: _low_level_execute_command(): starting 41684 1727204479.56093: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 41684 1727204479.56588: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 41684 1727204479.56604: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41684 1727204479.56622: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 
1727204479.56633: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204479.56682: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 41684 1727204479.56694: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 41684 1727204479.56761: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41684 1727204479.58417: stdout chunk (state=3): >>>/root <<< 41684 1727204479.58523: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41684 1727204479.58571: stderr chunk (state=3): >>><<< 41684 1727204479.58574: stdout chunk (state=3): >>><<< 41684 1727204479.58596: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking 
match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41684 1727204479.58604: _low_level_execute_command(): starting 41684 1727204479.58611: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204479.5859404-44388-106330438701548 `" && echo ansible-tmp-1727204479.5859404-44388-106330438701548="` echo /root/.ansible/tmp/ansible-tmp-1727204479.5859404-44388-106330438701548 `" ) && sleep 0' 41684 1727204479.59042: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 41684 1727204479.59047: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41684 1727204479.59085: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204479.59091: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 41684 1727204479.59101: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41684 1727204479.59106: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 41684 1727204479.59113: stderr 
chunk (state=3): >>>debug2: match found <<< 41684 1727204479.59119: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204479.59185: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 41684 1727204479.59189: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 41684 1727204479.59250: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41684 1727204479.61107: stdout chunk (state=3): >>>ansible-tmp-1727204479.5859404-44388-106330438701548=/root/.ansible/tmp/ansible-tmp-1727204479.5859404-44388-106330438701548 <<< 41684 1727204479.61224: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41684 1727204479.61282: stderr chunk (state=3): >>><<< 41684 1727204479.61285: stdout chunk (state=3): >>><<< 41684 1727204479.61298: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204479.5859404-44388-106330438701548=/root/.ansible/tmp/ansible-tmp-1727204479.5859404-44388-106330438701548 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41684 1727204479.61327: variable 'ansible_module_compression' from source: unknown 41684 1727204479.61370: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-41684fyviudxd/ansiballz_cache/ansible.modules.systemd-ZIP_DEFLATED 41684 1727204479.61425: variable 'ansible_facts' from source: unknown 41684 1727204479.61561: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204479.5859404-44388-106330438701548/AnsiballZ_systemd.py 41684 1727204479.61678: Sending initial data 41684 1727204479.61681: Sent initial data (156 bytes) 41684 1727204479.62374: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 41684 1727204479.62380: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41684 1727204479.62411: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found <<< 41684 1727204479.62424: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204479.62475: stderr chunk 
(state=3): >>>debug1: auto-mux: Trying existing master <<< 41684 1727204479.62487: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41684 1727204479.62506: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41684 1727204479.62558: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41684 1727204479.64259: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 41684 1727204479.64310: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 41684 1727204479.64363: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-41684fyviudxd/tmpymbih8p_ /root/.ansible/tmp/ansible-tmp-1727204479.5859404-44388-106330438701548/AnsiballZ_systemd.py <<< 41684 1727204479.64417: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 41684 1727204479.66207: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41684 1727204479.66322: stderr chunk (state=3): >>><<< 41684 1727204479.66326: stdout chunk (state=3): >>><<< 41684 1727204479.66341: done transferring module to remote 41684 1727204479.66351: _low_level_execute_command(): starting 41684 1727204479.66356: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204479.5859404-44388-106330438701548/ 
/root/.ansible/tmp/ansible-tmp-1727204479.5859404-44388-106330438701548/AnsiballZ_systemd.py && sleep 0' 41684 1727204479.66824: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 41684 1727204479.66829: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41684 1727204479.66869: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204479.66882: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204479.66931: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 41684 1727204479.66941: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41684 1727204479.66950: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41684 1727204479.67023: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41684 1727204479.68720: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41684 1727204479.68777: stderr chunk (state=3): >>><<< 41684 1727204479.68782: stdout chunk (state=3): >>><<< 41684 1727204479.68795: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41684 1727204479.68798: _low_level_execute_command(): starting 41684 1727204479.68804: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204479.5859404-44388-106330438701548/AnsiballZ_systemd.py && sleep 0' 41684 1727204479.69260: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 41684 1727204479.69285: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41684 1727204479.69298: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204479.69308: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204479.69360: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 41684 1727204479.69373: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41684 1727204479.69384: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41684 1727204479.69450: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41684 1727204479.94223: stdout chunk (state=3): >>> {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "NotifyAccess": "none", "RestartUSec": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "619", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Tue 2024-09-24 14:49:25 EDT", "ExecMainStartTimestampMonotonic": "28837083", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "619", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": 
"{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.<<< 41684 1727204479.94229: stdout chunk (state=3): >>>service", "ControlGroupId": "2418", "MemoryCurrent": "14229504", "MemoryAvailable": "infinity", "CPUUsageNSec": "1471811000", "TasksCurrent": "3", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "MemoryHigh": "infinity", 
"MemoryMax": "infinity", "MemorySwapMax": "infinity", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22342", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13964", "LimitNPROCSoft": "13964", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13964", "LimitSIGPENDINGSoft": "13964", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", 
"DynamicUser": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "dbus.socket sysinit.target system.slice", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "cloud-init.service NetworkManager-wait-online.service network.target network.service shutdown.target multi-user.target", "After": "dbus.socket systemd-journald.socket sysinit.target network-pre.target basic.target system.slice cloud-init-local.service dbus-broker.service", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": 
"/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Tue 2024-09-24 14:54:21 EDT", "StateChangeTimestampMonotonic": "324827295", "InactiveExitTimestamp": "Tue 2024-09-24 14:49:25 EDT", "InactiveExitTimestampMonotonic": "28837278", "ActiveEnterTimestamp": "Tue 2024-09-24 14:49:27 EDT", "ActiveEnterTimestampMonotonic": "30313565", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Tue 2024-09-24 14:49:25 EDT", "ConditionTimestampMonotonic": "28833288", "AssertTimestamp": "Tue 2024-09-24 14:49:25 EDT", "AssertTimestampMonotonic": "28833291", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "a065c0d4382c4b51bfc5a74ffa3d403d", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} <<< 41684 1727204479.95576: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.148 closed. 
<<< 41684 1727204479.95655: stderr chunk (state=3): >>><<< 41684 1727204479.95658: stdout chunk (state=3): >>><<< 41684 1727204479.95948: _low_level_execute_command() done: rc=0, stdout= {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "NotifyAccess": "none", "RestartUSec": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "619", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Tue 2024-09-24 14:49:25 EDT", "ExecMainStartTimestampMonotonic": "28837083", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "619", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call 
org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "2418", "MemoryCurrent": "14229504", "MemoryAvailable": "infinity", "CPUUsageNSec": "1471811000", "TasksCurrent": "3", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "MemoryHigh": "infinity", "MemoryMax": "infinity", "MemorySwapMax": "infinity", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22342", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", 
"LimitNPROC": "13964", "LimitNPROCSoft": "13964", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13964", "LimitSIGPENDINGSoft": "13964", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", 
"MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "dbus.socket sysinit.target system.slice", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "cloud-init.service NetworkManager-wait-online.service network.target network.service shutdown.target multi-user.target", "After": "dbus.socket systemd-journald.socket sysinit.target network-pre.target basic.target system.slice cloud-init-local.service dbus-broker.service", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Tue 2024-09-24 14:54:21 EDT", "StateChangeTimestampMonotonic": "324827295", "InactiveExitTimestamp": "Tue 2024-09-24 14:49:25 EDT", "InactiveExitTimestampMonotonic": "28837278", "ActiveEnterTimestamp": "Tue 2024-09-24 14:49:27 EDT", "ActiveEnterTimestampMonotonic": "30313565", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "OnSuccessJobMode": "fail", 
"OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Tue 2024-09-24 14:49:25 EDT", "ConditionTimestampMonotonic": "28833288", "AssertTimestamp": "Tue 2024-09-24 14:49:25 EDT", "AssertTimestampMonotonic": "28833291", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "a065c0d4382c4b51bfc5a74ffa3d403d", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master 
session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.148 closed. 41684 1727204479.95959: done with _execute_module (ansible.legacy.systemd, {'name': 'NetworkManager', 'state': 'started', 'enabled': True, '_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.systemd', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204479.5859404-44388-106330438701548/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 41684 1727204479.95962: _low_level_execute_command(): starting 41684 1727204479.95966: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204479.5859404-44388-106330438701548/ > /dev/null 2>&1 && sleep 0' 41684 1727204479.96532: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 41684 1727204479.96546: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41684 1727204479.96560: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41684 1727204479.96582: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41684 1727204479.96625: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 41684 1727204479.96637: stderr chunk (state=3): >>>debug2: match not found <<< 41684 1727204479.96650: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204479.96671: stderr chunk 
(state=3): >>>debug1: configuration requests final Match pass <<< 41684 1727204479.96683: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 41684 1727204479.96692: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 41684 1727204479.96706: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41684 1727204479.96718: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41684 1727204479.96732: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41684 1727204479.96743: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 41684 1727204479.96753: stderr chunk (state=3): >>>debug2: match found <<< 41684 1727204479.96768: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204479.96846: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 41684 1727204479.96863: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41684 1727204479.96884: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41684 1727204479.96977: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41684 1727204479.98780: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41684 1727204479.98855: stderr chunk (state=3): >>><<< 41684 1727204479.98860: stdout chunk (state=3): >>><<< 41684 1727204479.98974: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41684 1727204479.98978: handler run complete 41684 1727204479.98981: attempt loop complete, returning result 41684 1727204479.98983: _execute() done 41684 1727204479.98985: dumping result to json 41684 1727204479.99084: done dumping result, returning 41684 1727204479.99087: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager [0affcd87-79f5-3839-086d-00000000064d] 41684 1727204479.99090: sending task result for task 0affcd87-79f5-3839-086d-00000000064d ok: [managed-node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 41684 1727204479.99829: no more pending results, returning what we have 41684 1727204479.99833: results queue empty 41684 1727204479.99834: checking for any_errors_fatal 41684 1727204479.99839: done checking for any_errors_fatal 41684 1727204479.99840: checking for max_fail_percentage 41684 1727204479.99841: done checking for max_fail_percentage 41684 1727204479.99842: checking to see if all hosts have failed and the running result is not ok 41684 1727204479.99843: done checking to see if all hosts have failed 41684 
1727204479.99844: getting the remaining hosts for this loop 41684 1727204479.99845: done getting the remaining hosts for this loop 41684 1727204479.99849: getting the next task for host managed-node1 41684 1727204479.99856: done getting next task for host managed-node1 41684 1727204479.99860: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 41684 1727204479.99868: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 41684 1727204479.99880: getting variables 41684 1727204479.99881: in VariableManager get_vars() 41684 1727204479.99922: Calling all_inventory to load vars for managed-node1 41684 1727204479.99925: Calling groups_inventory to load vars for managed-node1 41684 1727204479.99928: Calling all_plugins_inventory to load vars for managed-node1 41684 1727204479.99939: Calling all_plugins_play to load vars for managed-node1 41684 1727204479.99941: Calling groups_plugins_inventory to load vars for managed-node1 41684 1727204479.99944: Calling groups_plugins_play to load vars for managed-node1 41684 1727204480.00829: done sending task result for task 0affcd87-79f5-3839-086d-00000000064d 41684 1727204480.00833: WORKER PROCESS EXITING 41684 1727204480.01602: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41684 1727204480.03333: done with get_vars() 41684 1727204480.03377: done getting variables 41684 1727204480.03447: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start wpa_supplicant] ***** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133 Tuesday 24 September 2024 15:01:20 -0400 (0:00:00.577) 0:00:36.436 ***** 41684 1727204480.03488: entering _queue_task() for managed-node1/service 41684 1727204480.03848: worker is 1 (out of 1 available) 41684 1727204480.03862: exiting _queue_task() for managed-node1/service 41684 1727204480.03878: done queuing things up, now waiting for results queue to drain 41684 1727204480.03880: waiting for pending results... 
41684 1727204480.04194: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 41684 1727204480.04356: in run() - task 0affcd87-79f5-3839-086d-00000000064e 41684 1727204480.04382: variable 'ansible_search_path' from source: unknown 41684 1727204480.04391: variable 'ansible_search_path' from source: unknown 41684 1727204480.04439: calling self._execute() 41684 1727204480.04566: variable 'ansible_host' from source: host vars for 'managed-node1' 41684 1727204480.04580: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41684 1727204480.04596: variable 'omit' from source: magic vars 41684 1727204480.05026: variable 'ansible_distribution_major_version' from source: facts 41684 1727204480.05044: Evaluated conditional (ansible_distribution_major_version != '6'): True 41684 1727204480.05171: variable 'network_provider' from source: set_fact 41684 1727204480.05182: Evaluated conditional (network_provider == "nm"): True 41684 1727204480.05286: variable '__network_wpa_supplicant_required' from source: role '' defaults 41684 1727204480.05388: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 41684 1727204480.05578: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 41684 1727204480.07974: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 41684 1727204480.08061: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 41684 1727204480.08107: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 41684 1727204480.08156: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 41684 1727204480.08192: Loading FilterModule 'urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 41684 1727204480.08500: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 41684 1727204480.08541: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 41684 1727204480.08583: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 41684 1727204480.08631: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 41684 1727204480.08652: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 41684 1727204480.08711: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 41684 1727204480.08740: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 41684 1727204480.08772: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 41684 1727204480.08825: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 41684 1727204480.08846: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 41684 1727204480.08903: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 41684 1727204480.08932: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 41684 1727204480.08962: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 41684 1727204480.09018: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 41684 1727204480.09037: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 41684 1727204480.09196: variable 'network_connections' from source: include params 41684 1727204480.09218: variable 'interface0' from source: play vars 41684 1727204480.09306: variable 'interface0' from source: play vars 41684 1727204480.09328: variable 'interface1' from source: play vars 41684 1727204480.09401: variable 'interface1' from source: play vars 41684 1727204480.09488: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 41684 1727204480.09670: Loading TestModule 'core' 
from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 41684 1727204480.09710: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 41684 1727204480.09743: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 41684 1727204480.09784: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 41684 1727204480.09830: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 41684 1727204480.09858: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 41684 1727204480.09898: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 41684 1727204480.09929: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 41684 1727204480.09995: variable '__network_wireless_connections_defined' from source: role '' defaults 41684 1727204480.10367: variable 'network_connections' from source: include params 41684 1727204480.10422: variable 'interface0' from source: play vars 41684 1727204480.10589: variable 'interface0' from source: play vars 41684 1727204480.10602: variable 'interface1' from source: play vars 41684 1727204480.10759: variable 'interface1' from source: play vars 41684 1727204480.10797: Evaluated conditional (__network_wpa_supplicant_required): False 41684 1727204480.10850: when evaluation is False, skipping this task 41684 1727204480.10871: _execute() done 41684 
1727204480.10877: dumping result to json 41684 1727204480.10883: done dumping result, returning 41684 1727204480.10892: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant [0affcd87-79f5-3839-086d-00000000064e] 41684 1727204480.10900: sending task result for task 0affcd87-79f5-3839-086d-00000000064e skipping: [managed-node1] => { "changed": false, "false_condition": "__network_wpa_supplicant_required", "skip_reason": "Conditional result was False" } 41684 1727204480.11054: no more pending results, returning what we have 41684 1727204480.11059: results queue empty 41684 1727204480.11060: checking for any_errors_fatal 41684 1727204480.11081: done checking for any_errors_fatal 41684 1727204480.11082: checking for max_fail_percentage 41684 1727204480.11085: done checking for max_fail_percentage 41684 1727204480.11085: checking to see if all hosts have failed and the running result is not ok 41684 1727204480.11086: done checking to see if all hosts have failed 41684 1727204480.11087: getting the remaining hosts for this loop 41684 1727204480.11089: done getting the remaining hosts for this loop 41684 1727204480.11093: getting the next task for host managed-node1 41684 1727204480.11102: done getting next task for host managed-node1 41684 1727204480.11107: ^ task is: TASK: fedora.linux_system_roles.network : Enable network service 41684 1727204480.11111: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 41684 1727204480.11132: getting variables 41684 1727204480.11134: in VariableManager get_vars() 41684 1727204480.11184: Calling all_inventory to load vars for managed-node1 41684 1727204480.11187: Calling groups_inventory to load vars for managed-node1 41684 1727204480.11190: Calling all_plugins_inventory to load vars for managed-node1 41684 1727204480.11201: Calling all_plugins_play to load vars for managed-node1 41684 1727204480.11204: Calling groups_plugins_inventory to load vars for managed-node1 41684 1727204480.11207: Calling groups_plugins_play to load vars for managed-node1 41684 1727204480.12307: done sending task result for task 0affcd87-79f5-3839-086d-00000000064e 41684 1727204480.12310: WORKER PROCESS EXITING 41684 1727204480.13737: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41684 1727204480.16247: done with get_vars() 41684 1727204480.16279: done getting variables 41684 1727204480.16341: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable network service] ************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142 Tuesday 24 September 2024 15:01:20 -0400 (0:00:00.128) 0:00:36.565 ***** 41684 
1727204480.16374: entering _queue_task() for managed-node1/service 41684 1727204480.16799: worker is 1 (out of 1 available) 41684 1727204480.16810: exiting _queue_task() for managed-node1/service 41684 1727204480.16823: done queuing things up, now waiting for results queue to drain 41684 1727204480.16824: waiting for pending results... 41684 1727204480.17832: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Enable network service 41684 1727204480.17999: in run() - task 0affcd87-79f5-3839-086d-00000000064f 41684 1727204480.18022: variable 'ansible_search_path' from source: unknown 41684 1727204480.18030: variable 'ansible_search_path' from source: unknown 41684 1727204480.18081: calling self._execute() 41684 1727204480.18201: variable 'ansible_host' from source: host vars for 'managed-node1' 41684 1727204480.18213: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41684 1727204480.18228: variable 'omit' from source: magic vars 41684 1727204480.18653: variable 'ansible_distribution_major_version' from source: facts 41684 1727204480.18676: Evaluated conditional (ansible_distribution_major_version != '6'): True 41684 1727204480.18800: variable 'network_provider' from source: set_fact 41684 1727204480.18813: Evaluated conditional (network_provider == "initscripts"): False 41684 1727204480.18822: when evaluation is False, skipping this task 41684 1727204480.18829: _execute() done 41684 1727204480.18835: dumping result to json 41684 1727204480.18842: done dumping result, returning 41684 1727204480.18853: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Enable network service [0affcd87-79f5-3839-086d-00000000064f] 41684 1727204480.18866: sending task result for task 0affcd87-79f5-3839-086d-00000000064f skipping: [managed-node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 41684 
1727204480.19018: no more pending results, returning what we have 41684 1727204480.19022: results queue empty 41684 1727204480.19023: checking for any_errors_fatal 41684 1727204480.19033: done checking for any_errors_fatal 41684 1727204480.19034: checking for max_fail_percentage 41684 1727204480.19036: done checking for max_fail_percentage 41684 1727204480.19036: checking to see if all hosts have failed and the running result is not ok 41684 1727204480.19037: done checking to see if all hosts have failed 41684 1727204480.19038: getting the remaining hosts for this loop 41684 1727204480.19040: done getting the remaining hosts for this loop 41684 1727204480.19045: getting the next task for host managed-node1 41684 1727204480.19053: done getting next task for host managed-node1 41684 1727204480.19057: ^ task is: TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 41684 1727204480.19062: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 41684 1727204480.19089: getting variables 41684 1727204480.19091: in VariableManager get_vars() 41684 1727204480.19142: Calling all_inventory to load vars for managed-node1 41684 1727204480.19145: Calling groups_inventory to load vars for managed-node1 41684 1727204480.19148: Calling all_plugins_inventory to load vars for managed-node1 41684 1727204480.19160: Calling all_plugins_play to load vars for managed-node1 41684 1727204480.19163: Calling groups_plugins_inventory to load vars for managed-node1 41684 1727204480.19168: Calling groups_plugins_play to load vars for managed-node1 41684 1727204480.20184: done sending task result for task 0affcd87-79f5-3839-086d-00000000064f 41684 1727204480.20188: WORKER PROCESS EXITING 41684 1727204480.21027: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41684 1727204480.23687: done with get_vars() 41684 1727204480.23718: done getting variables 41684 1727204480.24096: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ensure initscripts network file dependency is present] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150 Tuesday 24 September 2024 15:01:20 -0400 (0:00:00.077) 0:00:36.643 ***** 41684 1727204480.24153: entering _queue_task() for managed-node1/copy 41684 1727204480.24576: worker is 1 (out of 1 available) 41684 1727204480.24589: exiting _queue_task() for managed-node1/copy 41684 1727204480.24602: done queuing things up, now waiting for results queue to drain 41684 1727204480.24604: waiting for pending results... 
41684 1727204480.24909: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 41684 1727204480.25110: in run() - task 0affcd87-79f5-3839-086d-000000000650 41684 1727204480.25211: variable 'ansible_search_path' from source: unknown 41684 1727204480.25219: variable 'ansible_search_path' from source: unknown 41684 1727204480.25263: calling self._execute() 41684 1727204480.25391: variable 'ansible_host' from source: host vars for 'managed-node1' 41684 1727204480.25402: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41684 1727204480.25421: variable 'omit' from source: magic vars 41684 1727204480.25803: variable 'ansible_distribution_major_version' from source: facts 41684 1727204480.25820: Evaluated conditional (ansible_distribution_major_version != '6'): True 41684 1727204480.25935: variable 'network_provider' from source: set_fact 41684 1727204480.25945: Evaluated conditional (network_provider == "initscripts"): False 41684 1727204480.25954: when evaluation is False, skipping this task 41684 1727204480.25960: _execute() done 41684 1727204480.25968: dumping result to json 41684 1727204480.25973: done dumping result, returning 41684 1727204480.25982: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present [0affcd87-79f5-3839-086d-000000000650] 41684 1727204480.25991: sending task result for task 0affcd87-79f5-3839-086d-000000000650 skipping: [managed-node1] => { "changed": false, "false_condition": "network_provider == \"initscripts\"", "skip_reason": "Conditional result was False" } 41684 1727204480.26174: no more pending results, returning what we have 41684 1727204480.26178: results queue empty 41684 1727204480.26179: checking for any_errors_fatal 41684 1727204480.26186: done checking for any_errors_fatal 41684 1727204480.26187: checking for max_fail_percentage 41684 
1727204480.26189: done checking for max_fail_percentage 41684 1727204480.26189: checking to see if all hosts have failed and the running result is not ok 41684 1727204480.26190: done checking to see if all hosts have failed 41684 1727204480.26191: getting the remaining hosts for this loop 41684 1727204480.26193: done getting the remaining hosts for this loop 41684 1727204480.26197: getting the next task for host managed-node1 41684 1727204480.26205: done getting next task for host managed-node1 41684 1727204480.26209: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking connection profiles 41684 1727204480.26214: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 41684 1727204480.26235: getting variables 41684 1727204480.26237: in VariableManager get_vars() 41684 1727204480.26285: Calling all_inventory to load vars for managed-node1 41684 1727204480.26288: Calling groups_inventory to load vars for managed-node1 41684 1727204480.26290: Calling all_plugins_inventory to load vars for managed-node1 41684 1727204480.26302: Calling all_plugins_play to load vars for managed-node1 41684 1727204480.26305: Calling groups_plugins_inventory to load vars for managed-node1 41684 1727204480.26308: Calling groups_plugins_play to load vars for managed-node1 41684 1727204480.27110: done sending task result for task 0affcd87-79f5-3839-086d-000000000650 41684 1727204480.27114: WORKER PROCESS EXITING 41684 1727204480.28701: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41684 1727204480.30581: done with get_vars() 41684 1727204480.30610: done getting variables TASK [fedora.linux_system_roles.network : Configure networking connection profiles] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159 Tuesday 24 September 2024 15:01:20 -0400 (0:00:00.065) 0:00:36.708 ***** 41684 1727204480.30700: entering _queue_task() for managed-node1/fedora.linux_system_roles.network_connections 41684 1727204480.31091: worker is 1 (out of 1 available) 41684 1727204480.31102: exiting _queue_task() for managed-node1/fedora.linux_system_roles.network_connections 41684 1727204480.31134: done queuing things up, now waiting for results queue to drain 41684 1727204480.31135: waiting for pending results... 
41684 1727204480.31475: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Configure networking connection profiles 41684 1727204480.31620: in run() - task 0affcd87-79f5-3839-086d-000000000651 41684 1727204480.31646: variable 'ansible_search_path' from source: unknown 41684 1727204480.31657: variable 'ansible_search_path' from source: unknown 41684 1727204480.31707: calling self._execute() 41684 1727204480.31814: variable 'ansible_host' from source: host vars for 'managed-node1' 41684 1727204480.31825: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41684 1727204480.31837: variable 'omit' from source: magic vars 41684 1727204480.32213: variable 'ansible_distribution_major_version' from source: facts 41684 1727204480.32368: Evaluated conditional (ansible_distribution_major_version != '6'): True 41684 1727204480.32470: variable 'omit' from source: magic vars 41684 1727204480.32528: variable 'omit' from source: magic vars 41684 1727204480.32918: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 41684 1727204480.36210: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 41684 1727204480.36295: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 41684 1727204480.36336: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 41684 1727204480.36383: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 41684 1727204480.36415: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 41684 1727204480.36507: variable 'network_provider' from source: set_fact 41684 1727204480.36653: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 41684 1727204480.36693: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 41684 1727204480.36728: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 41684 1727204480.36776: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 41684 1727204480.36795: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 41684 1727204480.36882: variable 'omit' from source: magic vars 41684 1727204480.37007: variable 'omit' from source: magic vars 41684 1727204480.37145: variable 'network_connections' from source: include params 41684 1727204480.37162: variable 'interface0' from source: play vars 41684 1727204480.37250: variable 'interface0' from source: play vars 41684 1727204480.37266: variable 'interface1' from source: play vars 41684 1727204480.37334: variable 'interface1' from source: play vars 41684 1727204480.37515: variable 'omit' from source: magic vars 41684 1727204480.37528: variable '__lsr_ansible_managed' from source: task vars 41684 1727204480.37603: variable '__lsr_ansible_managed' from source: task vars 41684 1727204480.37915: Loaded config def from plugin (lookup/template) 41684 1727204480.37924: Loading LookupModule 'template' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/template.py 41684 1727204480.37958: File lookup term: 
get_ansible_managed.j2 41684 1727204480.37966: variable 'ansible_search_path' from source: unknown 41684 1727204480.37974: evaluation_path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks 41684 1727204480.37988: search_path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/templates/get_ansible_managed.j2 /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/get_ansible_managed.j2 /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/templates/get_ansible_managed.j2 /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/get_ansible_managed.j2 /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 41684 1727204480.38014: variable 'ansible_search_path' from source: unknown 41684 1727204480.45447: variable 'ansible_managed' from source: unknown 41684 1727204480.45595: variable 'omit' from source: magic vars 41684 1727204480.45624: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 41684 1727204480.45655: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 41684 1727204480.45675: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 41684 1727204480.45693: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41684 1727204480.45703: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41684 1727204480.45733: variable 'inventory_hostname' from source: host vars for 'managed-node1' 41684 1727204480.45736: variable 'ansible_host' from source: host vars for 'managed-node1' 41684 1727204480.45739: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41684 1727204480.45837: Set connection var ansible_connection to ssh 41684 1727204480.45842: Set connection var ansible_pipelining to False 41684 1727204480.45848: Set connection var ansible_module_compression to ZIP_DEFLATED 41684 1727204480.45854: Set connection var ansible_timeout to 10 41684 1727204480.45865: Set connection var ansible_shell_executable to /bin/sh 41684 1727204480.45868: Set connection var ansible_shell_type to sh 41684 1727204480.45891: variable 'ansible_shell_executable' from source: unknown 41684 1727204480.45894: variable 'ansible_connection' from source: unknown 41684 1727204480.45897: variable 'ansible_module_compression' from source: unknown 41684 1727204480.45899: variable 'ansible_shell_type' from source: unknown 41684 1727204480.45902: variable 'ansible_shell_executable' from source: unknown 41684 1727204480.45904: variable 'ansible_host' from source: host vars for 'managed-node1' 41684 1727204480.45906: variable 'ansible_pipelining' from source: unknown 41684 1727204480.45911: variable 'ansible_timeout' from source: unknown 41684 1727204480.45915: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41684 1727204480.46050: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 41684 1727204480.46066: variable 'omit' from source: magic vars 41684 1727204480.46069: starting attempt loop 41684 1727204480.46071: running the handler 41684 1727204480.46086: _low_level_execute_command(): starting 41684 1727204480.46094: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 41684 1727204480.46799: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 41684 1727204480.46811: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41684 1727204480.46822: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41684 1727204480.46837: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41684 1727204480.46880: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 41684 1727204480.46888: stderr chunk (state=3): >>>debug2: match not found <<< 41684 1727204480.46895: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204480.46909: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 41684 1727204480.46917: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 41684 1727204480.46923: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 41684 1727204480.46931: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41684 1727204480.46940: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41684 1727204480.46951: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41684 1727204480.46958: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 
10.31.9.148 originally 10.31.9.148 <<< 41684 1727204480.46970: stderr chunk (state=3): >>>debug2: match found <<< 41684 1727204480.46978: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204480.47047: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 41684 1727204480.47067: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41684 1727204480.47075: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41684 1727204480.47171: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41684 1727204480.48780: stdout chunk (state=3): >>>/root <<< 41684 1727204480.48896: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41684 1727204480.49017: stderr chunk (state=3): >>><<< 41684 1727204480.49029: stdout chunk (state=3): >>><<< 41684 1727204480.49156: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: 
mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41684 1727204480.49160: _low_level_execute_command(): starting 41684 1727204480.49163: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204480.4905913-44425-33375443809410 `" && echo ansible-tmp-1727204480.4905913-44425-33375443809410="` echo /root/.ansible/tmp/ansible-tmp-1727204480.4905913-44425-33375443809410 `" ) && sleep 0' 41684 1727204480.49878: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 41684 1727204480.49892: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41684 1727204480.49907: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41684 1727204480.49932: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41684 1727204480.49977: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 41684 1727204480.49990: stderr chunk (state=3): >>>debug2: match not found <<< 41684 1727204480.50004: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204480.50022: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 41684 1727204480.50043: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 41684 1727204480.50056: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 41684 1727204480.50072: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41684 1727204480.50087: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41684 1727204480.50114: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf <<< 41684 1727204480.50127: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 41684 1727204480.50139: stderr chunk (state=3): >>>debug2: match found <<< 41684 1727204480.50173: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204480.50285: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 41684 1727204480.50386: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41684 1727204480.50639: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41684 1727204480.50719: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41684 1727204480.52570: stdout chunk (state=3): >>>ansible-tmp-1727204480.4905913-44425-33375443809410=/root/.ansible/tmp/ansible-tmp-1727204480.4905913-44425-33375443809410 <<< 41684 1727204480.52752: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41684 1727204480.52756: stdout chunk (state=3): >>><<< 41684 1727204480.52767: stderr chunk (state=3): >>><<< 41684 1727204480.52783: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204480.4905913-44425-33375443809410=/root/.ansible/tmp/ansible-tmp-1727204480.4905913-44425-33375443809410 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41684 1727204480.52832: variable 'ansible_module_compression' from source: unknown 41684 1727204480.52883: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-41684fyviudxd/ansiballz_cache/ansible_collections.fedora.linux_system_roles.plugins.modules.network_connections-ZIP_DEFLATED 41684 1727204480.52916: variable 'ansible_facts' from source: unknown 41684 1727204480.53003: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204480.4905913-44425-33375443809410/AnsiballZ_network_connections.py 41684 1727204480.53243: Sending initial data 41684 1727204480.53247: Sent initial data (167 bytes) 41684 1727204480.54294: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 41684 1727204480.54310: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41684 1727204480.54321: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41684 1727204480.54335: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41684 1727204480.54386: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 41684 1727204480.54394: stderr chunk (state=3): >>>debug2: match not found <<< 41684 1727204480.54404: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204480.54418: stderr chunk (state=3): >>>debug1: 
configuration requests final Match pass <<< 41684 1727204480.54426: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 41684 1727204480.54433: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 41684 1727204480.54444: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41684 1727204480.54450: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41684 1727204480.54466: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41684 1727204480.54471: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 41684 1727204480.54480: stderr chunk (state=3): >>>debug2: match found <<< 41684 1727204480.54493: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204480.54571: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 41684 1727204480.54585: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41684 1727204480.54593: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41684 1727204480.54690: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41684 1727204480.56392: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 41684 1727204480.56450: stderr 
chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 41684 1727204480.56503: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-41684fyviudxd/tmpo9oy05h4 /root/.ansible/tmp/ansible-tmp-1727204480.4905913-44425-33375443809410/AnsiballZ_network_connections.py <<< 41684 1727204480.56557: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 41684 1727204480.58843: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41684 1727204480.59085: stderr chunk (state=3): >>><<< 41684 1727204480.59088: stdout chunk (state=3): >>><<< 41684 1727204480.59105: done transferring module to remote 41684 1727204480.59116: _low_level_execute_command(): starting 41684 1727204480.59121: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204480.4905913-44425-33375443809410/ /root/.ansible/tmp/ansible-tmp-1727204480.4905913-44425-33375443809410/AnsiballZ_network_connections.py && sleep 0' 41684 1727204480.59738: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 41684 1727204480.59747: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41684 1727204480.59758: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41684 1727204480.59776: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41684 1727204480.59815: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 41684 1727204480.59822: stderr chunk (state=3): >>>debug2: match not found <<< 41684 1727204480.59833: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204480.59845: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 41684 
1727204480.59853: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 41684 1727204480.59860: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 41684 1727204480.59869: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41684 1727204480.59879: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41684 1727204480.59890: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41684 1727204480.59898: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 41684 1727204480.59904: stderr chunk (state=3): >>>debug2: match found <<< 41684 1727204480.59913: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204480.59987: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 41684 1727204480.60000: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41684 1727204480.60011: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41684 1727204480.60095: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41684 1727204480.61813: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41684 1727204480.61891: stderr chunk (state=3): >>><<< 41684 1727204480.61894: stdout chunk (state=3): >>><<< 41684 1727204480.61911: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41684 1727204480.61918: _low_level_execute_command(): starting 41684 1727204480.61921: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204480.4905913-44425-33375443809410/AnsiballZ_network_connections.py && sleep 0' 41684 1727204480.63235: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 41684 1727204480.64181: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41684 1727204480.64193: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41684 1727204480.64206: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41684 1727204480.64245: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 41684 1727204480.64252: stderr chunk (state=3): >>>debug2: match not found <<< 41684 1727204480.64266: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204480.64277: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 41684 1727204480.64285: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address 
<<< 41684 1727204480.64291: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 41684 1727204480.64299: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41684 1727204480.64309: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41684 1727204480.64319: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41684 1727204480.64326: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 41684 1727204480.64333: stderr chunk (state=3): >>>debug2: match found <<< 41684 1727204480.64342: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204480.64429: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 41684 1727204480.64433: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41684 1727204480.64441: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41684 1727204480.64527: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41684 1727204481.04403: stdout chunk (state=3): >>>Traceback (most recent call last): File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_cqub8grx/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/connection.py", line 113, in _nm_profile_volatile_update2_call_back File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_cqub8grx/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/client.py", line 102, in fail ansible_collections.fedora.linux_system_roles.plugins.module_utils.network_lsr.nm.error.LsrNetworkNmError: Connection volatilize aborted on 
ethtest0/368423eb-f869-403f-a6af-2344dcd8e0b3: error=unknown <<< 41684 1727204481.06354: stdout chunk (state=3): >>>Traceback (most recent call last): File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_cqub8grx/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/connection.py", line 113, in _nm_profile_volatile_update2_call_back File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_cqub8grx/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/client.py", line 102, in fail ansible_collections.fedora.linux_system_roles.plugins.module_utils.network_lsr.nm.error.LsrNetworkNmError: Connection volatilize aborted on ethtest1/7d6131f1-a08f-4727-b007-3042c5fbcd66: error=unknown <<< 41684 1727204481.06603: stdout chunk (state=3): >>> {"changed": true, "warnings": [], "stderr": "\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "ethtest0", "persistent_state": "absent", "state": "down"}, {"name": "ethtest1", "persistent_state": "absent", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "ethtest0", "persistent_state": "absent", "state": "down"}, {"name": "ethtest1", "persistent_state": "absent", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} <<< 41684 1727204481.08186: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.148 closed. 
<<< 41684 1727204481.08265: stderr chunk (state=3): >>><<< 41684 1727204481.08269: stdout chunk (state=3): >>><<< 41684 1727204481.08416: _low_level_execute_command() done: rc=0, stdout=Traceback (most recent call last): File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_cqub8grx/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/connection.py", line 113, in _nm_profile_volatile_update2_call_back File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_cqub8grx/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/client.py", line 102, in fail ansible_collections.fedora.linux_system_roles.plugins.module_utils.network_lsr.nm.error.LsrNetworkNmError: Connection volatilize aborted on ethtest0/368423eb-f869-403f-a6af-2344dcd8e0b3: error=unknown Traceback (most recent call last): File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_cqub8grx/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/connection.py", line 113, in _nm_profile_volatile_update2_call_back File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_cqub8grx/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/client.py", line 102, in fail ansible_collections.fedora.linux_system_roles.plugins.module_utils.network_lsr.nm.error.LsrNetworkNmError: Connection volatilize aborted on ethtest1/7d6131f1-a08f-4727-b007-3042c5fbcd66: error=unknown {"changed": true, "warnings": [], "stderr": "\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "ethtest0", "persistent_state": "absent", "state": "down"}, {"name": "ethtest1", "persistent_state": 
"absent", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "ethtest0", "persistent_state": "absent", "state": "down"}, {"name": "ethtest1", "persistent_state": "absent", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.148 closed. 
41684 1727204481.08425: done with _execute_module (fedora.linux_system_roles.network_connections, {'provider': 'nm', 'connections': [{'name': 'ethtest0', 'persistent_state': 'absent', 'state': 'down'}, {'name': 'ethtest1', 'persistent_state': 'absent', 'state': 'down'}], '__header': '#\n# Ansible managed\n#\n# system_role:network\n', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'fedora.linux_system_roles.network_connections', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204480.4905913-44425-33375443809410/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 41684 1727204481.08427: _low_level_execute_command(): starting 41684 1727204481.08430: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204480.4905913-44425-33375443809410/ > /dev/null 2>&1 && sleep 0' 41684 1727204481.09026: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 41684 1727204481.09041: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41684 1727204481.09057: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41684 1727204481.09077: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41684 1727204481.09125: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 41684 1727204481.09136: stderr chunk (state=3): >>>debug2: match not found <<< 41684 1727204481.09149: stderr chunk (state=3): 
>>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204481.09168: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 41684 1727204481.09181: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 41684 1727204481.09198: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 41684 1727204481.09211: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41684 1727204481.09223: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41684 1727204481.09237: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41684 1727204481.09250: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 41684 1727204481.09260: stderr chunk (state=3): >>>debug2: match found <<< 41684 1727204481.09275: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204481.09356: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 41684 1727204481.09376: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41684 1727204481.09392: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41684 1727204481.09488: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41684 1727204481.11277: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41684 1727204481.11350: stderr chunk (state=3): >>><<< 41684 1727204481.11353: stdout chunk (state=3): >>><<< 41684 1727204481.11470: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf 
debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41684 1727204481.11474: handler run complete 41684 1727204481.11477: attempt loop complete, returning result 41684 1727204481.11479: _execute() done 41684 1727204481.11481: dumping result to json 41684 1727204481.11483: done dumping result, returning 41684 1727204481.11485: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Configure networking connection profiles [0affcd87-79f5-3839-086d-000000000651] 41684 1727204481.11487: sending task result for task 0affcd87-79f5-3839-086d-000000000651 41684 1727204481.11755: done sending task result for task 0affcd87-79f5-3839-086d-000000000651 41684 1727204481.11759: WORKER PROCESS EXITING changed: [managed-node1] => { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "name": "ethtest0", "persistent_state": "absent", "state": "down" }, { "name": "ethtest1", "persistent_state": "absent", "state": "down" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true } STDERR: 
41684 1727204481.11872: no more pending results, returning what we have 41684 1727204481.11876: results queue empty 41684 1727204481.11877: checking for any_errors_fatal 41684 1727204481.11882: done checking for any_errors_fatal 41684 1727204481.11883: checking for max_fail_percentage 41684 1727204481.11884: done checking for max_fail_percentage 41684 1727204481.11885: checking to see if all hosts have failed and the running result is not ok 41684 1727204481.11886: done checking to see if all hosts have failed 41684 1727204481.11886: getting the remaining hosts for this loop 41684 1727204481.11888: done getting the remaining hosts for this loop 41684 1727204481.11891: getting the next task for host managed-node1 41684 1727204481.11897: done getting next task for host managed-node1 41684 1727204481.11900: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking state 41684 1727204481.11904: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 41684 1727204481.11914: getting variables 41684 1727204481.11920: in VariableManager get_vars() 41684 1727204481.11960: Calling all_inventory to load vars for managed-node1 41684 1727204481.11968: Calling groups_inventory to load vars for managed-node1 41684 1727204481.11971: Calling all_plugins_inventory to load vars for managed-node1 41684 1727204481.11979: Calling all_plugins_play to load vars for managed-node1 41684 1727204481.11982: Calling groups_plugins_inventory to load vars for managed-node1 41684 1727204481.11985: Calling groups_plugins_play to load vars for managed-node1 41684 1727204481.13706: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41684 1727204481.15572: done with get_vars() 41684 1727204481.15596: done getting variables TASK [fedora.linux_system_roles.network : Configure networking state] ********** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171 Tuesday 24 September 2024 15:01:21 -0400 (0:00:00.849) 0:00:37.558 ***** 41684 1727204481.15688: entering _queue_task() for managed-node1/fedora.linux_system_roles.network_state 41684 1727204481.16014: worker is 1 (out of 1 available) 41684 1727204481.16025: exiting _queue_task() for managed-node1/fedora.linux_system_roles.network_state 41684 1727204481.16041: done queuing things up, now waiting for results queue to drain 41684 1727204481.16042: waiting for pending results... 
41684 1727204481.16348: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Configure networking state 41684 1727204481.16494: in run() - task 0affcd87-79f5-3839-086d-000000000652 41684 1727204481.16516: variable 'ansible_search_path' from source: unknown 41684 1727204481.16524: variable 'ansible_search_path' from source: unknown 41684 1727204481.16572: calling self._execute() 41684 1727204481.16676: variable 'ansible_host' from source: host vars for 'managed-node1' 41684 1727204481.16688: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41684 1727204481.16703: variable 'omit' from source: magic vars 41684 1727204481.17094: variable 'ansible_distribution_major_version' from source: facts 41684 1727204481.17113: Evaluated conditional (ansible_distribution_major_version != '6'): True 41684 1727204481.17242: variable 'network_state' from source: role '' defaults 41684 1727204481.17265: Evaluated conditional (network_state != {}): False 41684 1727204481.17273: when evaluation is False, skipping this task 41684 1727204481.17281: _execute() done 41684 1727204481.17289: dumping result to json 41684 1727204481.17296: done dumping result, returning 41684 1727204481.17306: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Configure networking state [0affcd87-79f5-3839-086d-000000000652] 41684 1727204481.17318: sending task result for task 0affcd87-79f5-3839-086d-000000000652 skipping: [managed-node1] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 41684 1727204481.17478: no more pending results, returning what we have 41684 1727204481.17482: results queue empty 41684 1727204481.17483: checking for any_errors_fatal 41684 1727204481.17494: done checking for any_errors_fatal 41684 1727204481.17495: checking for max_fail_percentage 41684 1727204481.17497: done checking for max_fail_percentage 41684 1727204481.17498: 
checking to see if all hosts have failed and the running result is not ok 41684 1727204481.17499: done checking to see if all hosts have failed 41684 1727204481.17499: getting the remaining hosts for this loop 41684 1727204481.17501: done getting the remaining hosts for this loop 41684 1727204481.17506: getting the next task for host managed-node1 41684 1727204481.17513: done getting next task for host managed-node1 41684 1727204481.17518: ^ task is: TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 41684 1727204481.17523: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 41684 1727204481.17547: getting variables 41684 1727204481.17549: in VariableManager get_vars() 41684 1727204481.17602: Calling all_inventory to load vars for managed-node1 41684 1727204481.17605: Calling groups_inventory to load vars for managed-node1 41684 1727204481.17608: Calling all_plugins_inventory to load vars for managed-node1 41684 1727204481.17620: Calling all_plugins_play to load vars for managed-node1 41684 1727204481.17623: Calling groups_plugins_inventory to load vars for managed-node1 41684 1727204481.17627: Calling groups_plugins_play to load vars for managed-node1 41684 1727204481.18783: done sending task result for task 0affcd87-79f5-3839-086d-000000000652 41684 1727204481.18786: WORKER PROCESS EXITING 41684 1727204481.19311: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41684 1727204481.21009: done with get_vars() 41684 1727204481.21034: done getting variables 41684 1727204481.21096: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show stderr messages for the network_connections] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177 Tuesday 24 September 2024 15:01:21 -0400 (0:00:00.054) 0:00:37.612 ***** 41684 1727204481.21131: entering _queue_task() for managed-node1/debug 41684 1727204481.21437: worker is 1 (out of 1 available) 41684 1727204481.21448: exiting _queue_task() for managed-node1/debug 41684 1727204481.21461: done queuing things up, now waiting for results queue to drain 41684 1727204481.21462: waiting for pending results... 
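The task just queued ("Show stderr messages for the network_connections", main.yml:177) is a `debug` action — the log loads ActionModule 'debug' for it — and its result a few entries later prints `__network_connections_result.stderr_lines`. The role's real task body is not shown in this log; a minimal sketch of a debug task that would produce that output:

```yaml
# Sketch only; grounded in the debug action and the variable name
# (__network_connections_result.stderr_lines) visible in the log.
- name: Show stderr messages for the network_connections
  debug:
    var: __network_connections_result.stderr_lines
```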
41684 1727204481.21751: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 41684 1727204481.21890: in run() - task 0affcd87-79f5-3839-086d-000000000653 41684 1727204481.21916: variable 'ansible_search_path' from source: unknown 41684 1727204481.21925: variable 'ansible_search_path' from source: unknown 41684 1727204481.21968: calling self._execute() 41684 1727204481.22063: variable 'ansible_host' from source: host vars for 'managed-node1' 41684 1727204481.22075: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41684 1727204481.22087: variable 'omit' from source: magic vars 41684 1727204481.22478: variable 'ansible_distribution_major_version' from source: facts 41684 1727204481.22498: Evaluated conditional (ansible_distribution_major_version != '6'): True 41684 1727204481.22510: variable 'omit' from source: magic vars 41684 1727204481.22576: variable 'omit' from source: magic vars 41684 1727204481.22615: variable 'omit' from source: magic vars 41684 1727204481.22660: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 41684 1727204481.22709: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 41684 1727204481.22736: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 41684 1727204481.22759: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41684 1727204481.22782: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41684 1727204481.22816: variable 'inventory_hostname' from source: host vars for 'managed-node1' 41684 1727204481.22825: variable 'ansible_host' from source: host vars for 'managed-node1' 41684 1727204481.22834: variable 'ansible_ssh_extra_args' from 
source: host vars for 'managed-node1' 41684 1727204481.22949: Set connection var ansible_connection to ssh 41684 1727204481.22961: Set connection var ansible_pipelining to False 41684 1727204481.22975: Set connection var ansible_module_compression to ZIP_DEFLATED 41684 1727204481.22986: Set connection var ansible_timeout to 10 41684 1727204481.23003: Set connection var ansible_shell_executable to /bin/sh 41684 1727204481.23011: Set connection var ansible_shell_type to sh 41684 1727204481.23041: variable 'ansible_shell_executable' from source: unknown 41684 1727204481.23049: variable 'ansible_connection' from source: unknown 41684 1727204481.23057: variable 'ansible_module_compression' from source: unknown 41684 1727204481.23065: variable 'ansible_shell_type' from source: unknown 41684 1727204481.23074: variable 'ansible_shell_executable' from source: unknown 41684 1727204481.23080: variable 'ansible_host' from source: host vars for 'managed-node1' 41684 1727204481.23088: variable 'ansible_pipelining' from source: unknown 41684 1727204481.23094: variable 'ansible_timeout' from source: unknown 41684 1727204481.23107: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41684 1727204481.23257: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 41684 1727204481.23277: variable 'omit' from source: magic vars 41684 1727204481.23288: starting attempt loop 41684 1727204481.23295: running the handler 41684 1727204481.23432: variable '__network_connections_result' from source: set_fact 41684 1727204481.23490: handler run complete 41684 1727204481.23516: attempt loop complete, returning result 41684 1727204481.23523: _execute() done 41684 1727204481.23530: dumping result to json 41684 1727204481.23542: 
done dumping result, returning 41684 1727204481.23554: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections [0affcd87-79f5-3839-086d-000000000653] 41684 1727204481.23570: sending task result for task 0affcd87-79f5-3839-086d-000000000653
ok: [managed-node1] => {
    "__network_connections_result.stderr_lines": [
        ""
    ]
}
41684 1727204481.23739: no more pending results, returning what we have 41684 1727204481.23743: results queue empty 41684 1727204481.23744: checking for any_errors_fatal 41684 1727204481.23749: done checking for any_errors_fatal 41684 1727204481.23750: checking for max_fail_percentage 41684 1727204481.23752: done checking for max_fail_percentage 41684 1727204481.23753: checking to see if all hosts have failed and the running result is not ok 41684 1727204481.23754: done checking to see if all hosts have failed 41684 1727204481.23754: getting the remaining hosts for this loop 41684 1727204481.23756: done getting the remaining hosts for this loop 41684 1727204481.23760: getting the next task for host managed-node1 41684 1727204481.23770: done getting next task for host managed-node1 41684 1727204481.23775: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 41684 1727204481.23779: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=23, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 41684 1727204481.23792: getting variables 41684 1727204481.23794: in VariableManager get_vars() 41684 1727204481.23840: Calling all_inventory to load vars for managed-node1 41684 1727204481.23843: Calling groups_inventory to load vars for managed-node1 41684 1727204481.23846: Calling all_plugins_inventory to load vars for managed-node1 41684 1727204481.23857: Calling all_plugins_play to load vars for managed-node1 41684 1727204481.23860: Calling groups_plugins_inventory to load vars for managed-node1 41684 1727204481.23863: Calling groups_plugins_play to load vars for managed-node1 41684 1727204481.24882: done sending task result for task 0affcd87-79f5-3839-086d-000000000653 41684 1727204481.24886: WORKER PROCESS EXITING 41684 1727204481.25742: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41684 1727204481.27439: done with get_vars() 41684 1727204481.27463: done getting variables 41684 1727204481.27523: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_connections] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181 Tuesday 24 September 2024 15:01:21 -0400 (0:00:00.064) 0:00:37.677 ***** 41684 1727204481.27556: entering _queue_task() for managed-node1/debug 41684 1727204481.27869: worker is 1 (out of 1 available) 41684 1727204481.27882: exiting _queue_task() for managed-node1/debug 41684 
1727204481.27894: done queuing things up, now waiting for results queue to drain 41684 1727204481.27895: waiting for pending results... 41684 1727204481.28177: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 41684 1727204481.28320: in run() - task 0affcd87-79f5-3839-086d-000000000654 41684 1727204481.28347: variable 'ansible_search_path' from source: unknown 41684 1727204481.28355: variable 'ansible_search_path' from source: unknown 41684 1727204481.28400: calling self._execute() 41684 1727204481.28512: variable 'ansible_host' from source: host vars for 'managed-node1' 41684 1727204481.28523: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41684 1727204481.28537: variable 'omit' from source: magic vars 41684 1727204481.28938: variable 'ansible_distribution_major_version' from source: facts 41684 1727204481.28956: Evaluated conditional (ansible_distribution_major_version != '6'): True 41684 1727204481.28971: variable 'omit' from source: magic vars 41684 1727204481.29038: variable 'omit' from source: magic vars 41684 1727204481.29080: variable 'omit' from source: magic vars 41684 1727204481.29133: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 41684 1727204481.29179: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 41684 1727204481.29212: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 41684 1727204481.29235: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41684 1727204481.29252: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41684 1727204481.29290: variable 'inventory_hostname' from source: host vars for 'managed-node1' 41684 
1727204481.29300: variable 'ansible_host' from source: host vars for 'managed-node1' 41684 1727204481.29308: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41684 1727204481.29415: Set connection var ansible_connection to ssh 41684 1727204481.29433: Set connection var ansible_pipelining to False 41684 1727204481.29443: Set connection var ansible_module_compression to ZIP_DEFLATED 41684 1727204481.29453: Set connection var ansible_timeout to 10 41684 1727204481.29467: Set connection var ansible_shell_executable to /bin/sh 41684 1727204481.29475: Set connection var ansible_shell_type to sh 41684 1727204481.29506: variable 'ansible_shell_executable' from source: unknown 41684 1727204481.29514: variable 'ansible_connection' from source: unknown 41684 1727204481.29522: variable 'ansible_module_compression' from source: unknown 41684 1727204481.29533: variable 'ansible_shell_type' from source: unknown 41684 1727204481.29541: variable 'ansible_shell_executable' from source: unknown 41684 1727204481.29547: variable 'ansible_host' from source: host vars for 'managed-node1' 41684 1727204481.29554: variable 'ansible_pipelining' from source: unknown 41684 1727204481.29566: variable 'ansible_timeout' from source: unknown 41684 1727204481.29574: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41684 1727204481.29724: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 41684 1727204481.29739: variable 'omit' from source: magic vars 41684 1727204481.29754: starting attempt loop 41684 1727204481.29761: running the handler 41684 1727204481.29817: variable '__network_connections_result' from source: set_fact 41684 1727204481.29908: variable '__network_connections_result' from 
source: set_fact 41684 1727204481.30036: handler run complete 41684 1727204481.30072: attempt loop complete, returning result 41684 1727204481.30080: _execute() done 41684 1727204481.30088: dumping result to json 41684 1727204481.30097: done dumping result, returning 41684 1727204481.30109: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections [0affcd87-79f5-3839-086d-000000000654] 41684 1727204481.30119: sending task result for task 0affcd87-79f5-3839-086d-000000000654
ok: [managed-node1] => {
    "__network_connections_result": {
        "_invocation": {
            "module_args": {
                "__debug_flags": "",
                "__header": "#\n# Ansible managed\n#\n# system_role:network\n",
                "connections": [
                    {
                        "name": "ethtest0",
                        "persistent_state": "absent",
                        "state": "down"
                    },
                    {
                        "name": "ethtest1",
                        "persistent_state": "absent",
                        "state": "down"
                    }
                ],
                "force_state_change": false,
                "ignore_errors": false,
                "provider": "nm"
            }
        },
        "changed": true,
        "failed": false,
        "stderr": "\n",
        "stderr_lines": [
            ""
        ]
    }
}
41684 1727204481.30328: no more pending results, returning what we have 41684 1727204481.30332: results queue empty 41684 1727204481.30334: checking for any_errors_fatal 41684 1727204481.30341: done checking for any_errors_fatal 41684 1727204481.30342: checking for max_fail_percentage 41684 1727204481.30344: done checking for max_fail_percentage 41684 1727204481.30345: checking to see if all hosts have failed and the running result is not ok 41684 1727204481.30346: done checking to see if all hosts have failed 41684 1727204481.30346: getting the remaining hosts for this loop 41684 1727204481.30348: done getting the remaining hosts for this loop 41684 1727204481.30352: getting the next task for host managed-node1 41684 1727204481.30360: done getting next task for host managed-node1 41684 1727204481.30366: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 41684 1727204481.30371: ^ state is: HOST 
STATE: block=3, task=11, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 41684 1727204481.30385: getting variables 41684 1727204481.30387: in VariableManager get_vars() 41684 1727204481.30434: Calling all_inventory to load vars for managed-node1 41684 1727204481.30437: Calling groups_inventory to load vars for managed-node1 41684 1727204481.30439: Calling all_plugins_inventory to load vars for managed-node1 41684 1727204481.30452: Calling all_plugins_play to load vars for managed-node1 41684 1727204481.30455: Calling groups_plugins_inventory to load vars for managed-node1 41684 1727204481.30458: Calling groups_plugins_play to load vars for managed-node1 41684 1727204481.31383: done sending task result for task 0affcd87-79f5-3839-086d-000000000654 41684 1727204481.31387: WORKER PROCESS EXITING 41684 1727204481.32249: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41684 1727204481.34060: done with get_vars() 41684 1727204481.34096: done getting variables 41684 1727204481.34160: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_state] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186 Tuesday 24 September 2024 15:01:21 -0400 (0:00:00.066) 0:00:37.743 ***** 41684 1727204481.34199: entering _queue_task() for managed-node1/debug 41684 1727204481.34554: worker is 1 (out of 1 available) 41684 1727204481.34570: exiting _queue_task() for managed-node1/debug 41684 1727204481.34585: done queuing things up, now waiting for results queue to drain 41684 1727204481.34586: waiting for pending results... 41684 1727204481.34990: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 41684 1727204481.35140: in run() - task 0affcd87-79f5-3839-086d-000000000655 41684 1727204481.35166: variable 'ansible_search_path' from source: unknown 41684 1727204481.35175: variable 'ansible_search_path' from source: unknown 41684 1727204481.35218: calling self._execute() 41684 1727204481.35338: variable 'ansible_host' from source: host vars for 'managed-node1' 41684 1727204481.35471: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41684 1727204481.35499: variable 'omit' from source: magic vars 41684 1727204481.36487: variable 'ansible_distribution_major_version' from source: facts 41684 1727204481.36589: Evaluated conditional (ansible_distribution_major_version != '6'): True 41684 1727204481.36820: variable 'network_state' from source: role '' defaults 41684 1727204481.36838: Evaluated conditional (network_state != {}): False 41684 1727204481.36847: when evaluation is False, skipping this task 41684 1727204481.36855: _execute() done 41684 1727204481.36862: dumping result to json 41684 1727204481.36877: done 
dumping result, returning 41684 1727204481.36887: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state [0affcd87-79f5-3839-086d-000000000655] 41684 1727204481.36898: sending task result for task 0affcd87-79f5-3839-086d-000000000655
skipping: [managed-node1] => {
    "false_condition": "network_state != {}"
}
41684 1727204481.37195: no more pending results, returning what we have 41684 1727204481.37200: results queue empty 41684 1727204481.37201: checking for any_errors_fatal 41684 1727204481.37213: done checking for any_errors_fatal 41684 1727204481.37214: checking for max_fail_percentage 41684 1727204481.37216: done checking for max_fail_percentage 41684 1727204481.37217: checking to see if all hosts have failed and the running result is not ok 41684 1727204481.37217: done checking to see if all hosts have failed 41684 1727204481.37218: getting the remaining hosts for this loop 41684 1727204481.37220: done getting the remaining hosts for this loop 41684 1727204481.37226: getting the next task for host managed-node1 41684 1727204481.37233: done getting next task for host managed-node1 41684 1727204481.37238: ^ task is: TASK: fedora.linux_system_roles.network : Re-test connectivity 41684 1727204481.37244: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 41684 1727204481.37271: getting variables 41684 1727204481.37273: in VariableManager get_vars() 41684 1727204481.37325: Calling all_inventory to load vars for managed-node1 41684 1727204481.37329: Calling groups_inventory to load vars for managed-node1 41684 1727204481.37331: Calling all_plugins_inventory to load vars for managed-node1 41684 1727204481.37345: Calling all_plugins_play to load vars for managed-node1 41684 1727204481.37349: Calling groups_plugins_inventory to load vars for managed-node1 41684 1727204481.37353: Calling groups_plugins_play to load vars for managed-node1 41684 1727204481.38616: done sending task result for task 0affcd87-79f5-3839-086d-000000000655 41684 1727204481.38620: WORKER PROCESS EXITING 41684 1727204481.39801: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41684 1727204481.41562: done with get_vars() 41684 1727204481.41595: done getting variables TASK [fedora.linux_system_roles.network : Re-test connectivity] **************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192 Tuesday 24 September 2024 15:01:21 -0400 (0:00:00.075) 0:00:37.818 ***** 41684 1727204481.41702: entering _queue_task() for managed-node1/ping 41684 1727204481.42432: worker is 1 (out of 1 available) 41684 1727204481.42446: exiting _queue_task() for managed-node1/ping 41684 1727204481.42459: done queuing things up, now waiting for results queue to drain 41684 1727204481.42460: waiting for pending results... 
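"Re-test connectivity" (main.yml:192) is queued as a `ping` task — note `entering _queue_task() for managed-node1/ping` — which round-trips a small module over the established SSH connection rather than sending ICMP, so the low-level SSH command execution that follows is expected. The role's actual task body is not shown here; a minimal sketch under that assumption:

```yaml
# Sketch; the log shows only that the action dispatched is the ping module.
- name: Re-test connectivity
  ping:
```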
41684 1727204481.43077: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Re-test connectivity 41684 1727204481.43215: in run() - task 0affcd87-79f5-3839-086d-000000000656 41684 1727204481.43234: variable 'ansible_search_path' from source: unknown 41684 1727204481.43241: variable 'ansible_search_path' from source: unknown 41684 1727204481.43284: calling self._execute() 41684 1727204481.43388: variable 'ansible_host' from source: host vars for 'managed-node1' 41684 1727204481.43399: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41684 1727204481.43411: variable 'omit' from source: magic vars 41684 1727204481.43804: variable 'ansible_distribution_major_version' from source: facts 41684 1727204481.43823: Evaluated conditional (ansible_distribution_major_version != '6'): True 41684 1727204481.43835: variable 'omit' from source: magic vars 41684 1727204481.43903: variable 'omit' from source: magic vars 41684 1727204481.43943: variable 'omit' from source: magic vars 41684 1727204481.43996: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 41684 1727204481.44043: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 41684 1727204481.44078: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 41684 1727204481.44101: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41684 1727204481.44206: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41684 1727204481.44237: variable 'inventory_hostname' from source: host vars for 'managed-node1' 41684 1727204481.44245: variable 'ansible_host' from source: host vars for 'managed-node1' 41684 1727204481.44252: variable 'ansible_ssh_extra_args' from source: host vars for 
'managed-node1' 41684 1727204481.44381: Set connection var ansible_connection to ssh 41684 1727204481.44392: Set connection var ansible_pipelining to False 41684 1727204481.44401: Set connection var ansible_module_compression to ZIP_DEFLATED 41684 1727204481.44414: Set connection var ansible_timeout to 10 41684 1727204481.44426: Set connection var ansible_shell_executable to /bin/sh 41684 1727204481.44432: Set connection var ansible_shell_type to sh 41684 1727204481.44461: variable 'ansible_shell_executable' from source: unknown 41684 1727204481.44472: variable 'ansible_connection' from source: unknown 41684 1727204481.44482: variable 'ansible_module_compression' from source: unknown 41684 1727204481.44492: variable 'ansible_shell_type' from source: unknown 41684 1727204481.44500: variable 'ansible_shell_executable' from source: unknown 41684 1727204481.44506: variable 'ansible_host' from source: host vars for 'managed-node1' 41684 1727204481.44513: variable 'ansible_pipelining' from source: unknown 41684 1727204481.44521: variable 'ansible_timeout' from source: unknown 41684 1727204481.44527: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41684 1727204481.44742: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 41684 1727204481.44757: variable 'omit' from source: magic vars 41684 1727204481.44768: starting attempt loop 41684 1727204481.44775: running the handler 41684 1727204481.44794: _low_level_execute_command(): starting 41684 1727204481.44806: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 41684 1727204481.45577: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 41684 1727204481.45592: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41684 
1727204481.45608: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41684 1727204481.45624: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41684 1727204481.45666: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 41684 1727204481.45679: stderr chunk (state=3): >>>debug2: match not found <<< 41684 1727204481.45693: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204481.45710: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 41684 1727204481.45725: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 41684 1727204481.45735: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 41684 1727204481.45747: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41684 1727204481.45759: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41684 1727204481.45777: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41684 1727204481.45789: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 41684 1727204481.45800: stderr chunk (state=3): >>>debug2: match found <<< 41684 1727204481.45813: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204481.45890: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 41684 1727204481.45905: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41684 1727204481.45918: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41684 1727204481.46056: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41684 
1727204481.47594: stdout chunk (state=3): >>>/root <<< 41684 1727204481.47767: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41684 1727204481.47779: stderr chunk (state=3): >>><<< 41684 1727204481.47782: stdout chunk (state=3): >>><<< 41684 1727204481.47806: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41684 1727204481.47820: _low_level_execute_command(): starting 41684 1727204481.47827: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204481.4780576-44476-206389587383711 `" && echo ansible-tmp-1727204481.4780576-44476-206389587383711="` echo /root/.ansible/tmp/ansible-tmp-1727204481.4780576-44476-206389587383711 `" ) && sleep 0' 41684 1727204481.49461: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 
3.2.2 4 Jun 2024 <<< 41684 1727204481.49543: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41684 1727204481.49553: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41684 1727204481.49571: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41684 1727204481.49607: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 41684 1727204481.49614: stderr chunk (state=3): >>>debug2: match not found <<< 41684 1727204481.49624: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204481.49636: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 41684 1727204481.49649: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 41684 1727204481.49655: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 41684 1727204481.49662: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41684 1727204481.49675: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41684 1727204481.49689: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41684 1727204481.49760: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 41684 1727204481.49770: stderr chunk (state=3): >>>debug2: match found <<< 41684 1727204481.49779: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204481.49843: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 41684 1727204481.49880: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41684 1727204481.49894: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 
<<< 41684 1727204481.50050: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41684 1727204481.51884: stdout chunk (state=3): >>>ansible-tmp-1727204481.4780576-44476-206389587383711=/root/.ansible/tmp/ansible-tmp-1727204481.4780576-44476-206389587383711 <<< 41684 1727204481.52076: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41684 1727204481.52080: stdout chunk (state=3): >>><<< 41684 1727204481.52087: stderr chunk (state=3): >>><<< 41684 1727204481.52112: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204481.4780576-44476-206389587383711=/root/.ansible/tmp/ansible-tmp-1727204481.4780576-44476-206389587383711 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41684 1727204481.52161: variable 'ansible_module_compression' from source: unknown 41684 1727204481.52208: ANSIBALLZ: using cached module: 
/root/.ansible/tmp/ansible-local-41684fyviudxd/ansiballz_cache/ansible.modules.ping-ZIP_DEFLATED 41684 1727204481.52244: variable 'ansible_facts' from source: unknown 41684 1727204481.52315: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204481.4780576-44476-206389587383711/AnsiballZ_ping.py 41684 1727204481.52857: Sending initial data 41684 1727204481.52861: Sent initial data (153 bytes) 41684 1727204481.56142: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 41684 1727204481.56384: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41684 1727204481.56394: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41684 1727204481.56408: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41684 1727204481.56447: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 41684 1727204481.56454: stderr chunk (state=3): >>>debug2: match not found <<< 41684 1727204481.56465: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204481.56483: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 41684 1727204481.56490: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 41684 1727204481.56497: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 41684 1727204481.56505: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41684 1727204481.56514: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41684 1727204481.56525: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41684 1727204481.56532: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 41684 
1727204481.56538: stderr chunk (state=3): >>>debug2: match found <<< 41684 1727204481.56547: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204481.56625: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 41684 1727204481.56643: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41684 1727204481.56785: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41684 1727204481.56864: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41684 1727204481.58573: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 41684 1727204481.58621: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 41684 1727204481.58681: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-41684fyviudxd/tmp8o14zc1q /root/.ansible/tmp/ansible-tmp-1727204481.4780576-44476-206389587383711/AnsiballZ_ping.py <<< 41684 1727204481.58734: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 41684 1727204481.60074: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41684 1727204481.60078: stderr chunk (state=3): >>><<< 41684 1727204481.60081: stdout chunk (state=3): >>><<< 41684 1727204481.60104: done transferring module 
to remote 41684 1727204481.60114: _low_level_execute_command(): starting 41684 1727204481.60121: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204481.4780576-44476-206389587383711/ /root/.ansible/tmp/ansible-tmp-1727204481.4780576-44476-206389587383711/AnsiballZ_ping.py && sleep 0' 41684 1727204481.62211: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 41684 1727204481.62217: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41684 1727204481.62380: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204481.62384: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 41684 1727204481.62457: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found <<< 41684 1727204481.62463: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204481.62540: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 41684 1727204481.62636: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41684 1727204481.63399: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41684 1727204481.64478: stderr chunk (state=3): >>>debug2: Received exit status 
from master 0 <<< 41684 1727204481.64482: stderr chunk (state=3): >>><<< 41684 1727204481.64485: stdout chunk (state=3): >>><<< 41684 1727204481.64506: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41684 1727204481.64509: _low_level_execute_command(): starting 41684 1727204481.64511: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204481.4780576-44476-206389587383711/AnsiballZ_ping.py && sleep 0' 41684 1727204481.65933: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 41684 1727204481.66580: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41684 1727204481.66590: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41684 1727204481.66604: stderr chunk (state=3): >>>debug1: Reading configuration 
data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41684 1727204481.66642: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 41684 1727204481.66649: stderr chunk (state=3): >>>debug2: match not found <<< 41684 1727204481.66659: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204481.66677: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 41684 1727204481.66684: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 41684 1727204481.66691: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 41684 1727204481.66698: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41684 1727204481.66707: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41684 1727204481.66719: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41684 1727204481.66726: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 41684 1727204481.66732: stderr chunk (state=3): >>>debug2: match found <<< 41684 1727204481.66741: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204481.66819: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 41684 1727204481.66836: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41684 1727204481.66848: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41684 1727204481.66942: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41684 1727204481.79750: stdout chunk (state=3): >>> {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} <<< 41684 1727204481.80741: stderr chunk (state=3): >>>debug2: Received exit status 
from master 0 Shared connection to 10.31.9.148 closed. <<< 41684 1727204481.80745: stdout chunk (state=3): >>><<< 41684 1727204481.80747: stderr chunk (state=3): >>><<< 41684 1727204481.80769: _low_level_execute_command() done: rc=0, stdout= {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.148 closed. 
41684 1727204481.80795: done with _execute_module (ping, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ping', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204481.4780576-44476-206389587383711/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 41684 1727204481.80805: _low_level_execute_command(): starting 41684 1727204481.80809: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204481.4780576-44476-206389587383711/ > /dev/null 2>&1 && sleep 0' 41684 1727204481.81584: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 41684 1727204481.81589: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41684 1727204481.81608: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41684 1727204481.81644: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found <<< 41684 1727204481.81650: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration <<< 41684 1727204481.81655: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41684 1727204481.81673: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/ssh/ssh_config <<< 41684 1727204481.81679: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204481.81771: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 41684 1727204481.81789: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41684 1727204481.81878: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41684 1727204481.83817: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41684 1727204481.83820: stdout chunk (state=3): >>><<< 41684 1727204481.83823: stderr chunk (state=3): >>><<< 41684 1727204481.83825: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 
debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41684 1727204481.83830: handler run complete 41684 1727204481.83861: attempt loop complete, returning result 41684 1727204481.83868: _execute() done 41684 1727204481.83872: dumping result to json 41684 1727204481.83878: done dumping result, returning 41684 1727204481.83886: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Re-test connectivity [0affcd87-79f5-3839-086d-000000000656] 41684 1727204481.83912: sending task result for task 0affcd87-79f5-3839-086d-000000000656 41684 1727204481.84022: done sending task result for task 0affcd87-79f5-3839-086d-000000000656 41684 1727204481.84025: WORKER PROCESS EXITING ok: [managed-node1] => { "changed": false, "ping": "pong" } 41684 1727204481.84103: no more pending results, returning what we have 41684 1727204481.84107: results queue empty 41684 1727204481.84108: checking for any_errors_fatal 41684 1727204481.84114: done checking for any_errors_fatal 41684 1727204481.84115: checking for max_fail_percentage 41684 1727204481.84117: done checking for max_fail_percentage 41684 1727204481.84118: checking to see if all hosts have failed and the running result is not ok 41684 1727204481.84118: done checking to see if all hosts have failed 41684 1727204481.84119: getting the remaining hosts for this loop 41684 1727204481.84120: done getting the remaining hosts for this loop 41684 1727204481.84124: getting the next task for host managed-node1 41684 1727204481.84134: done getting next task for host managed-node1 41684 1727204481.84136: ^ task is: TASK: meta (role_complete) 41684 1727204481.84140: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? 
(HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 41684 1727204481.84150: getting variables 41684 1727204481.84152: in VariableManager get_vars() 41684 1727204481.84204: Calling all_inventory to load vars for managed-node1 41684 1727204481.84207: Calling groups_inventory to load vars for managed-node1 41684 1727204481.84209: Calling all_plugins_inventory to load vars for managed-node1 41684 1727204481.84219: Calling all_plugins_play to load vars for managed-node1 41684 1727204481.84222: Calling groups_plugins_inventory to load vars for managed-node1 41684 1727204481.84224: Calling groups_plugins_play to load vars for managed-node1 41684 1727204481.85909: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41684 1727204481.87291: done with get_vars() 41684 1727204481.87318: done getting variables 41684 1727204481.87405: done queuing things up, now waiting for results queue to drain 41684 1727204481.87408: results queue empty 41684 1727204481.87409: checking for any_errors_fatal 41684 1727204481.87412: done checking for any_errors_fatal 41684 1727204481.87412: checking for max_fail_percentage 41684 1727204481.87413: done checking for max_fail_percentage 41684 1727204481.87414: checking to see if all hosts have failed and the running result is not ok 41684 1727204481.87415: done checking to see if all hosts have failed 41684 1727204481.87416: getting the 
remaining hosts for this loop 41684 1727204481.87417: done getting the remaining hosts for this loop 41684 1727204481.87420: getting the next task for host managed-node1 41684 1727204481.87424: done getting next task for host managed-node1 41684 1727204481.87427: ^ task is: TASK: Delete interface1 41684 1727204481.87429: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 41684 1727204481.87437: getting variables 41684 1727204481.87438: in VariableManager get_vars() 41684 1727204481.87454: Calling all_inventory to load vars for managed-node1 41684 1727204481.87456: Calling groups_inventory to load vars for managed-node1 41684 1727204481.87458: Calling all_plugins_inventory to load vars for managed-node1 41684 1727204481.87465: Calling all_plugins_play to load vars for managed-node1 41684 1727204481.87468: Calling groups_plugins_inventory to load vars for managed-node1 41684 1727204481.87471: Calling groups_plugins_play to load vars for managed-node1 41684 1727204481.88660: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41684 1727204481.90343: done with get_vars() 41684 1727204481.90372: done getting variables TASK [Delete interface1] ******************************************************* task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_route_device.yml:151 Tuesday 24 September 2024 15:01:21 -0400 (0:00:00.487) 0:00:38.306 ***** 
41684 1727204481.90459: entering _queue_task() for managed-node1/include_tasks 41684 1727204481.90781: worker is 1 (out of 1 available) 41684 1727204481.90794: exiting _queue_task() for managed-node1/include_tasks 41684 1727204481.90807: done queuing things up, now waiting for results queue to drain 41684 1727204481.90808: waiting for pending results... 41684 1727204481.90990: running TaskExecutor() for managed-node1/TASK: Delete interface1 41684 1727204481.91078: in run() - task 0affcd87-79f5-3839-086d-0000000000b5 41684 1727204481.91089: variable 'ansible_search_path' from source: unknown 41684 1727204481.91121: calling self._execute() 41684 1727204481.91196: variable 'ansible_host' from source: host vars for 'managed-node1' 41684 1727204481.91200: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41684 1727204481.91208: variable 'omit' from source: magic vars 41684 1727204481.91482: variable 'ansible_distribution_major_version' from source: facts 41684 1727204481.91495: Evaluated conditional (ansible_distribution_major_version != '6'): True 41684 1727204481.91500: _execute() done 41684 1727204481.91503: dumping result to json 41684 1727204481.91507: done dumping result, returning 41684 1727204481.91513: done running TaskExecutor() for managed-node1/TASK: Delete interface1 [0affcd87-79f5-3839-086d-0000000000b5] 41684 1727204481.91519: sending task result for task 0affcd87-79f5-3839-086d-0000000000b5 41684 1727204481.91608: done sending task result for task 0affcd87-79f5-3839-086d-0000000000b5 41684 1727204481.91611: WORKER PROCESS EXITING 41684 1727204481.91641: no more pending results, returning what we have 41684 1727204481.91647: in VariableManager get_vars() 41684 1727204481.91696: Calling all_inventory to load vars for managed-node1 41684 1727204481.91699: Calling groups_inventory to load vars for managed-node1 41684 1727204481.91701: Calling all_plugins_inventory to load vars for managed-node1 41684 1727204481.91715: Calling 
all_plugins_play to load vars for managed-node1 41684 1727204481.91718: Calling groups_plugins_inventory to load vars for managed-node1 41684 1727204481.91728: Calling groups_plugins_play to load vars for managed-node1 41684 1727204481.92882: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41684 1727204481.94616: done with get_vars() 41684 1727204481.94635: variable 'ansible_search_path' from source: unknown 41684 1727204481.94648: we have included files to process 41684 1727204481.94650: generating all_blocks data 41684 1727204481.94651: done generating all_blocks data 41684 1727204481.94657: processing included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/delete_interface.yml 41684 1727204481.94658: loading included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/delete_interface.yml 41684 1727204481.94661: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/delete_interface.yml 41684 1727204481.94897: done processing included file 41684 1727204481.94899: iterating over new_blocks loaded from include file 41684 1727204481.94901: in VariableManager get_vars() 41684 1727204481.94920: done with get_vars() 41684 1727204481.94922: filtering new block on tags 41684 1727204481.94949: done filtering new block on tags 41684 1727204481.94951: done iterating over new_blocks loaded from include file included: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/delete_interface.yml for managed-node1 41684 1727204481.94956: extending task lists for all hosts with included blocks 41684 1727204481.95794: done extending task lists 41684 1727204481.95795: done processing included files 41684 1727204481.95796: results queue empty 41684 1727204481.95796: checking for any_errors_fatal 41684 1727204481.95797: 
done checking for any_errors_fatal 41684 1727204481.95798: checking for max_fail_percentage 41684 1727204481.95799: done checking for max_fail_percentage 41684 1727204481.95799: checking to see if all hosts have failed and the running result is not ok 41684 1727204481.95800: done checking to see if all hosts have failed 41684 1727204481.95800: getting the remaining hosts for this loop 41684 1727204481.95801: done getting the remaining hosts for this loop 41684 1727204481.95803: getting the next task for host managed-node1 41684 1727204481.95806: done getting next task for host managed-node1 41684 1727204481.95807: ^ task is: TASK: Remove test interface if necessary 41684 1727204481.95809: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 41684 1727204481.95811: getting variables 41684 1727204481.95812: in VariableManager get_vars() 41684 1727204481.95822: Calling all_inventory to load vars for managed-node1 41684 1727204481.95824: Calling groups_inventory to load vars for managed-node1 41684 1727204481.95826: Calling all_plugins_inventory to load vars for managed-node1 41684 1727204481.95830: Calling all_plugins_play to load vars for managed-node1 41684 1727204481.95832: Calling groups_plugins_inventory to load vars for managed-node1 41684 1727204481.95833: Calling groups_plugins_play to load vars for managed-node1 41684 1727204481.96524: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41684 1727204481.97936: done with get_vars() 41684 1727204481.97958: done getting variables 41684 1727204481.98009: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Remove test interface if necessary] ************************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/delete_interface.yml:3 Tuesday 24 September 2024 15:01:21 -0400 (0:00:00.075) 0:00:38.381 ***** 41684 1727204481.98048: entering _queue_task() for managed-node1/command 41684 1727204481.98308: worker is 1 (out of 1 available) 41684 1727204481.98320: exiting _queue_task() for managed-node1/command 41684 1727204481.98334: done queuing things up, now waiting for results queue to drain 41684 1727204481.98336: waiting for pending results... 
41684 1727204481.98517: running TaskExecutor() for managed-node1/TASK: Remove test interface if necessary 41684 1727204481.98596: in run() - task 0affcd87-79f5-3839-086d-000000000777 41684 1727204481.98608: variable 'ansible_search_path' from source: unknown 41684 1727204481.98611: variable 'ansible_search_path' from source: unknown 41684 1727204481.98639: calling self._execute() 41684 1727204481.98710: variable 'ansible_host' from source: host vars for 'managed-node1' 41684 1727204481.98714: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41684 1727204481.98722: variable 'omit' from source: magic vars 41684 1727204481.98997: variable 'ansible_distribution_major_version' from source: facts 41684 1727204481.99014: Evaluated conditional (ansible_distribution_major_version != '6'): True 41684 1727204481.99024: variable 'omit' from source: magic vars 41684 1727204481.99056: variable 'omit' from source: magic vars 41684 1727204481.99130: variable 'interface' from source: set_fact 41684 1727204481.99144: variable 'omit' from source: magic vars 41684 1727204481.99180: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 41684 1727204481.99207: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 41684 1727204481.99229: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 41684 1727204481.99245: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41684 1727204481.99254: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41684 1727204481.99278: variable 'inventory_hostname' from source: host vars for 'managed-node1' 41684 1727204481.99281: variable 'ansible_host' from source: host vars for 'managed-node1' 41684 1727204481.99284: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41684 1727204481.99352: Set connection var ansible_connection to ssh 41684 1727204481.99356: Set connection var ansible_pipelining to False 41684 1727204481.99365: Set connection var ansible_module_compression to ZIP_DEFLATED 41684 1727204481.99369: Set connection var ansible_timeout to 10 41684 1727204481.99376: Set connection var ansible_shell_executable to /bin/sh 41684 1727204481.99378: Set connection var ansible_shell_type to sh 41684 1727204481.99396: variable 'ansible_shell_executable' from source: unknown 41684 1727204481.99400: variable 'ansible_connection' from source: unknown 41684 1727204481.99402: variable 'ansible_module_compression' from source: unknown 41684 1727204481.99405: variable 'ansible_shell_type' from source: unknown 41684 1727204481.99408: variable 'ansible_shell_executable' from source: unknown 41684 1727204481.99410: variable 'ansible_host' from source: host vars for 'managed-node1' 41684 1727204481.99412: variable 'ansible_pipelining' from source: unknown 41684 1727204481.99414: variable 'ansible_timeout' from source: unknown 41684 1727204481.99418: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41684 1727204481.99518: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 41684 1727204481.99527: variable 'omit' from source: magic vars 41684 1727204481.99532: starting attempt loop 41684 1727204481.99534: running the handler 41684 1727204481.99549: _low_level_execute_command(): starting 41684 1727204481.99561: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 41684 1727204482.00071: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 41684 1727204482.00095: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41684 1727204482.00109: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration <<< 41684 1727204482.00126: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204482.00168: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 41684 1727204482.00181: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 41684 1727204482.00246: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41684 1727204482.01772: stdout chunk (state=3): >>>/root <<< 41684 1727204482.01871: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41684 1727204482.01918: stderr chunk (state=3): >>><<< 41684 1727204482.01924: stdout chunk (state=3): >>><<< 41684 1727204482.01948: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 
originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41684 1727204482.01959: _low_level_execute_command(): starting 41684 1727204482.01968: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204482.0194778-44505-260777415038053 `" && echo ansible-tmp-1727204482.0194778-44505-260777415038053="` echo /root/.ansible/tmp/ansible-tmp-1727204482.0194778-44505-260777415038053 `" ) && sleep 0' 41684 1727204482.02542: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 41684 1727204482.02551: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41684 1727204482.02567: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41684 1727204482.02581: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41684 1727204482.02626: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 41684 1727204482.02648: stderr chunk (state=3): >>>debug2: match not found <<< 41684 1727204482.02665: stderr chunk (state=3): 
>>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204482.02685: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 41684 1727204482.02702: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 41684 1727204482.02717: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 41684 1727204482.02732: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41684 1727204482.02748: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41684 1727204482.02768: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41684 1727204482.02782: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 41684 1727204482.02794: stderr chunk (state=3): >>>debug2: match found <<< 41684 1727204482.02808: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204482.02888: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 41684 1727204482.02905: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41684 1727204482.02918: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41684 1727204482.03006: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41684 1727204482.04834: stdout chunk (state=3): >>>ansible-tmp-1727204482.0194778-44505-260777415038053=/root/.ansible/tmp/ansible-tmp-1727204482.0194778-44505-260777415038053 <<< 41684 1727204482.04954: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41684 1727204482.05040: stderr chunk (state=3): >>><<< 41684 1727204482.05043: stdout chunk (state=3): >>><<< 41684 1727204482.05269: _low_level_execute_command() done: rc=0, 
stdout=ansible-tmp-1727204482.0194778-44505-260777415038053=/root/.ansible/tmp/ansible-tmp-1727204482.0194778-44505-260777415038053 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41684 1727204482.05272: variable 'ansible_module_compression' from source: unknown 41684 1727204482.05274: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-41684fyviudxd/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 41684 1727204482.05276: variable 'ansible_facts' from source: unknown 41684 1727204482.05393: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204482.0194778-44505-260777415038053/AnsiballZ_command.py 41684 1727204482.05549: Sending initial data 41684 1727204482.05552: Sent initial data (156 bytes) 41684 1727204482.06526: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 41684 1727204482.06539: stderr chunk (state=3): >>>debug1: Reading 
configuration data /root/.ssh/config <<< 41684 1727204482.06552: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41684 1727204482.06572: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41684 1727204482.06618: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 41684 1727204482.06632: stderr chunk (state=3): >>>debug2: match not found <<< 41684 1727204482.06646: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204482.06662: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 41684 1727204482.06676: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 41684 1727204482.06686: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 41684 1727204482.06702: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41684 1727204482.06715: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41684 1727204482.06730: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41684 1727204482.06741: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 41684 1727204482.06751: stderr chunk (state=3): >>>debug2: match found <<< 41684 1727204482.06763: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204482.06847: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 41684 1727204482.06863: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41684 1727204482.06880: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41684 1727204482.06970: stderr chunk (state=3): >>>debug1: 
mux_client_request_session: master session id: 2 <<< 41684 1727204482.08708: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 41684 1727204482.08771: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 41684 1727204482.08827: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-41684fyviudxd/tmp2d614q1_ /root/.ansible/tmp/ansible-tmp-1727204482.0194778-44505-260777415038053/AnsiballZ_command.py <<< 41684 1727204482.08874: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 41684 1727204482.10116: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41684 1727204482.10298: stderr chunk (state=3): >>><<< 41684 1727204482.10301: stdout chunk (state=3): >>><<< 41684 1727204482.10303: done transferring module to remote 41684 1727204482.10305: _low_level_execute_command(): starting 41684 1727204482.10307: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204482.0194778-44505-260777415038053/ /root/.ansible/tmp/ansible-tmp-1727204482.0194778-44505-260777415038053/AnsiballZ_command.py && sleep 0' 41684 1727204482.10899: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 41684 1727204482.10914: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41684 
1727204482.10930: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41684 1727204482.10953: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41684 1727204482.11004: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 41684 1727204482.11016: stderr chunk (state=3): >>>debug2: match not found <<< 41684 1727204482.11030: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204482.11046: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 41684 1727204482.11058: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 41684 1727204482.11074: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 41684 1727204482.11086: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41684 1727204482.11099: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41684 1727204482.11113: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41684 1727204482.11124: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 41684 1727204482.11134: stderr chunk (state=3): >>>debug2: match found <<< 41684 1727204482.11146: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204482.11225: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 41684 1727204482.11247: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41684 1727204482.11267: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41684 1727204482.11351: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41684 
1727204482.13155: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41684 1727204482.13253: stderr chunk (state=3): >>><<< 41684 1727204482.13275: stdout chunk (state=3): >>><<< 41684 1727204482.13384: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41684 1727204482.13389: _low_level_execute_command(): starting 41684 1727204482.13391: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204482.0194778-44505-260777415038053/AnsiballZ_command.py && sleep 0' 41684 1727204482.14107: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 41684 1727204482.14131: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41684 1727204482.14147: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41684 
1727204482.14177: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41684 1727204482.14220: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 41684 1727204482.14233: stderr chunk (state=3): >>>debug2: match not found <<< 41684 1727204482.14248: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204482.14275: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 41684 1727204482.14289: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 41684 1727204482.14300: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 41684 1727204482.14322: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41684 1727204482.14337: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41684 1727204482.14353: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41684 1727204482.14371: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 41684 1727204482.14389: stderr chunk (state=3): >>>debug2: match found <<< 41684 1727204482.14404: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204482.14483: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 41684 1727204482.14511: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41684 1727204482.14528: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41684 1727204482.14629: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41684 1727204482.29353: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", 
"del", "ethtest1"], "start": "2024-09-24 15:01:22.276194", "end": "2024-09-24 15:01:22.292445", "delta": "0:00:00.016251", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link del ethtest1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}}<<< 41684 1727204482.29425: stdout chunk (state=3): >>> <<< 41684 1727204482.30748: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.148 closed. <<< 41684 1727204482.30752: stdout chunk (state=3): >>><<< 41684 1727204482.30754: stderr chunk (state=3): >>><<< 41684 1727204482.30904: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "del", "ethtest1"], "start": "2024-09-24 15:01:22.276194", "end": "2024-09-24 15:01:22.292445", "delta": "0:00:00.016251", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link del ethtest1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.148 closed. 41684 1727204482.30914: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip link del ethtest1', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204482.0194778-44505-260777415038053/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 41684 1727204482.30918: _low_level_execute_command(): starting 41684 1727204482.30920: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204482.0194778-44505-260777415038053/ > /dev/null 2>&1 && sleep 0' 41684 1727204482.32131: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 41684 1727204482.32134: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41684 1727204482.32171: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 
debug2: match not found <<< 41684 1727204482.32174: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 41684 1727204482.32177: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204482.32233: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 41684 1727204482.32488: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41684 1727204482.32491: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41684 1727204482.33247: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41684 1727204482.35048: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41684 1727204482.35145: stderr chunk (state=3): >>><<< 41684 1727204482.35149: stdout chunk (state=3): >>><<< 41684 1727204482.35475: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41684 1727204482.35479: handler run complete 41684 1727204482.35481: Evaluated conditional (False): False 41684 1727204482.35484: attempt loop complete, returning result 41684 1727204482.35486: _execute() done 41684 1727204482.35488: dumping result to json 41684 1727204482.35490: done dumping result, returning 41684 1727204482.35492: done running TaskExecutor() for managed-node1/TASK: Remove test interface if necessary [0affcd87-79f5-3839-086d-000000000777] 41684 1727204482.35493: sending task result for task 0affcd87-79f5-3839-086d-000000000777 41684 1727204482.35571: done sending task result for task 0affcd87-79f5-3839-086d-000000000777 41684 1727204482.35574: WORKER PROCESS EXITING ok: [managed-node1] => { "changed": false, "cmd": [ "ip", "link", "del", "ethtest1" ], "delta": "0:00:00.016251", "end": "2024-09-24 15:01:22.292445", "rc": 0, "start": "2024-09-24 15:01:22.276194" } 41684 1727204482.35648: no more pending results, returning what we have 41684 1727204482.35653: results queue empty 41684 1727204482.35654: checking for any_errors_fatal 41684 1727204482.35656: done checking for any_errors_fatal 41684 1727204482.35657: checking for max_fail_percentage 41684 1727204482.35658: done checking for max_fail_percentage 41684 1727204482.35659: checking to see if all hosts have failed and the running result is not ok 41684 1727204482.35660: done checking to see if all hosts have failed 41684 1727204482.35661: getting the 
remaining hosts for this loop 41684 1727204482.35667: done getting the remaining hosts for this loop 41684 1727204482.35671: getting the next task for host managed-node1 41684 1727204482.35681: done getting next task for host managed-node1 41684 1727204482.35686: ^ task is: TASK: Assert interface1 is absent 41684 1727204482.35689: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 41684 1727204482.35694: getting variables 41684 1727204482.35696: in VariableManager get_vars() 41684 1727204482.35744: Calling all_inventory to load vars for managed-node1 41684 1727204482.35747: Calling groups_inventory to load vars for managed-node1 41684 1727204482.35750: Calling all_plugins_inventory to load vars for managed-node1 41684 1727204482.35767: Calling all_plugins_play to load vars for managed-node1 41684 1727204482.35770: Calling groups_plugins_inventory to load vars for managed-node1 41684 1727204482.35774: Calling groups_plugins_play to load vars for managed-node1 41684 1727204482.49944: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41684 1727204482.51735: done with get_vars() 41684 1727204482.51776: done getting variables TASK [Assert interface1 is absent] ********************************************* task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_route_device.yml:153 Tuesday 24 September 2024 15:01:22 -0400 (0:00:00.538) 
0:00:38.920 ***** 41684 1727204482.51865: entering _queue_task() for managed-node1/include_tasks 41684 1727204482.52242: worker is 1 (out of 1 available) 41684 1727204482.52255: exiting _queue_task() for managed-node1/include_tasks 41684 1727204482.52272: done queuing things up, now waiting for results queue to drain 41684 1727204482.52275: waiting for pending results... 41684 1727204482.52644: running TaskExecutor() for managed-node1/TASK: Assert interface1 is absent 41684 1727204482.52747: in run() - task 0affcd87-79f5-3839-086d-0000000000b6 41684 1727204482.52766: variable 'ansible_search_path' from source: unknown 41684 1727204482.52799: calling self._execute() 41684 1727204482.52899: variable 'ansible_host' from source: host vars for 'managed-node1' 41684 1727204482.52905: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41684 1727204482.52917: variable 'omit' from source: magic vars 41684 1727204482.53347: variable 'ansible_distribution_major_version' from source: facts 41684 1727204482.53359: Evaluated conditional (ansible_distribution_major_version != '6'): True 41684 1727204482.53368: _execute() done 41684 1727204482.53372: dumping result to json 41684 1727204482.53375: done dumping result, returning 41684 1727204482.53385: done running TaskExecutor() for managed-node1/TASK: Assert interface1 is absent [0affcd87-79f5-3839-086d-0000000000b6] 41684 1727204482.53397: sending task result for task 0affcd87-79f5-3839-086d-0000000000b6 41684 1727204482.53496: done sending task result for task 0affcd87-79f5-3839-086d-0000000000b6 41684 1727204482.53499: WORKER PROCESS EXITING 41684 1727204482.53529: no more pending results, returning what we have 41684 1727204482.53535: in VariableManager get_vars() 41684 1727204482.53587: Calling all_inventory to load vars for managed-node1 41684 1727204482.53591: Calling groups_inventory to load vars for managed-node1 41684 1727204482.53593: Calling all_plugins_inventory to load vars for managed-node1 
41684 1727204482.53610: Calling all_plugins_play to load vars for managed-node1 41684 1727204482.53614: Calling groups_plugins_inventory to load vars for managed-node1 41684 1727204482.53617: Calling groups_plugins_play to load vars for managed-node1 41684 1727204482.56192: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41684 1727204482.58341: done with get_vars() 41684 1727204482.58368: variable 'ansible_search_path' from source: unknown 41684 1727204482.58385: we have included files to process 41684 1727204482.58386: generating all_blocks data 41684 1727204482.58388: done generating all_blocks data 41684 1727204482.58394: processing included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml 41684 1727204482.58395: loading included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml 41684 1727204482.58398: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml 41684 1727204482.58673: in VariableManager get_vars() 41684 1727204482.58698: done with get_vars() 41684 1727204482.58816: done processing included file 41684 1727204482.58818: iterating over new_blocks loaded from include file 41684 1727204482.58819: in VariableManager get_vars() 41684 1727204482.58834: done with get_vars() 41684 1727204482.58835: filtering new block on tags 41684 1727204482.58867: done filtering new block on tags 41684 1727204482.58870: done iterating over new_blocks loaded from include file included: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml for managed-node1 41684 1727204482.58874: extending task lists for all hosts with included blocks 41684 1727204482.60450: done extending task lists 41684 1727204482.60452: done 
processing included files 41684 1727204482.60453: results queue empty 41684 1727204482.60454: checking for any_errors_fatal 41684 1727204482.60459: done checking for any_errors_fatal 41684 1727204482.60460: checking for max_fail_percentage 41684 1727204482.60461: done checking for max_fail_percentage 41684 1727204482.60465: checking to see if all hosts have failed and the running result is not ok 41684 1727204482.60466: done checking to see if all hosts have failed 41684 1727204482.60467: getting the remaining hosts for this loop 41684 1727204482.60468: done getting the remaining hosts for this loop 41684 1727204482.60471: getting the next task for host managed-node1 41684 1727204482.60476: done getting next task for host managed-node1 41684 1727204482.60478: ^ task is: TASK: Include the task 'get_interface_stat.yml' 41684 1727204482.60481: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 41684 1727204482.60484: getting variables 41684 1727204482.60485: in VariableManager get_vars() 41684 1727204482.60500: Calling all_inventory to load vars for managed-node1 41684 1727204482.60503: Calling groups_inventory to load vars for managed-node1 41684 1727204482.60505: Calling all_plugins_inventory to load vars for managed-node1 41684 1727204482.60511: Calling all_plugins_play to load vars for managed-node1 41684 1727204482.60513: Calling groups_plugins_inventory to load vars for managed-node1 41684 1727204482.60516: Calling groups_plugins_play to load vars for managed-node1 41684 1727204482.63558: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41684 1727204482.67208: done with get_vars() 41684 1727204482.67357: done getting variables TASK [Include the task 'get_interface_stat.yml'] ******************************* task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml:3 Tuesday 24 September 2024 15:01:22 -0400 (0:00:00.155) 0:00:39.076 ***** 41684 1727204482.67450: entering _queue_task() for managed-node1/include_tasks 41684 1727204482.67943: worker is 1 (out of 1 available) 41684 1727204482.67955: exiting _queue_task() for managed-node1/include_tasks 41684 1727204482.67970: done queuing things up, now waiting for results queue to drain 41684 1727204482.67972: waiting for pending results... 
41684 1727204482.68281: running TaskExecutor() for managed-node1/TASK: Include the task 'get_interface_stat.yml' 41684 1727204482.68402: in run() - task 0affcd87-79f5-3839-086d-000000000816 41684 1727204482.68418: variable 'ansible_search_path' from source: unknown 41684 1727204482.68423: variable 'ansible_search_path' from source: unknown 41684 1727204482.68466: calling self._execute() 41684 1727204482.68567: variable 'ansible_host' from source: host vars for 'managed-node1' 41684 1727204482.68572: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41684 1727204482.68581: variable 'omit' from source: magic vars 41684 1727204482.68977: variable 'ansible_distribution_major_version' from source: facts 41684 1727204482.68994: Evaluated conditional (ansible_distribution_major_version != '6'): True 41684 1727204482.69001: _execute() done 41684 1727204482.69004: dumping result to json 41684 1727204482.69007: done dumping result, returning 41684 1727204482.69013: done running TaskExecutor() for managed-node1/TASK: Include the task 'get_interface_stat.yml' [0affcd87-79f5-3839-086d-000000000816] 41684 1727204482.69020: sending task result for task 0affcd87-79f5-3839-086d-000000000816 41684 1727204482.69145: no more pending results, returning what we have 41684 1727204482.69151: in VariableManager get_vars() 41684 1727204482.69206: Calling all_inventory to load vars for managed-node1 41684 1727204482.69209: Calling groups_inventory to load vars for managed-node1 41684 1727204482.69212: Calling all_plugins_inventory to load vars for managed-node1 41684 1727204482.69228: Calling all_plugins_play to load vars for managed-node1 41684 1727204482.69231: Calling groups_plugins_inventory to load vars for managed-node1 41684 1727204482.69235: Calling groups_plugins_play to load vars for managed-node1 41684 1727204482.69760: done sending task result for task 0affcd87-79f5-3839-086d-000000000816 41684 1727204482.69769: WORKER PROCESS EXITING 41684 
1727204482.71300: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41684 1727204482.73276: done with get_vars() 41684 1727204482.73296: variable 'ansible_search_path' from source: unknown 41684 1727204482.73298: variable 'ansible_search_path' from source: unknown 41684 1727204482.73340: we have included files to process 41684 1727204482.73342: generating all_blocks data 41684 1727204482.73343: done generating all_blocks data 41684 1727204482.73344: processing included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 41684 1727204482.73345: loading included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 41684 1727204482.73347: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 41684 1727204482.73560: done processing included file 41684 1727204482.73562: iterating over new_blocks loaded from include file 41684 1727204482.73566: in VariableManager get_vars() 41684 1727204482.73586: done with get_vars() 41684 1727204482.73588: filtering new block on tags 41684 1727204482.73612: done filtering new block on tags 41684 1727204482.73614: done iterating over new_blocks loaded from include file included: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml for managed-node1 41684 1727204482.73619: extending task lists for all hosts with included blocks 41684 1727204482.73792: done extending task lists 41684 1727204482.73794: done processing included files 41684 1727204482.73795: results queue empty 41684 1727204482.73796: checking for any_errors_fatal 41684 1727204482.73799: done checking for any_errors_fatal 41684 1727204482.73800: checking for max_fail_percentage 41684 1727204482.73801: done checking for 
max_fail_percentage 41684 1727204482.73802: checking to see if all hosts have failed and the running result is not ok 41684 1727204482.73802: done checking to see if all hosts have failed 41684 1727204482.73803: getting the remaining hosts for this loop 41684 1727204482.73804: done getting the remaining hosts for this loop 41684 1727204482.73807: getting the next task for host managed-node1 41684 1727204482.73811: done getting next task for host managed-node1 41684 1727204482.73820: ^ task is: TASK: Get stat for interface {{ interface }} 41684 1727204482.73824: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 41684 1727204482.73826: getting variables 41684 1727204482.73827: in VariableManager get_vars() 41684 1727204482.73840: Calling all_inventory to load vars for managed-node1 41684 1727204482.73842: Calling groups_inventory to load vars for managed-node1 41684 1727204482.73844: Calling all_plugins_inventory to load vars for managed-node1 41684 1727204482.73849: Calling all_plugins_play to load vars for managed-node1 41684 1727204482.73851: Calling groups_plugins_inventory to load vars for managed-node1 41684 1727204482.73853: Calling groups_plugins_play to load vars for managed-node1 41684 1727204482.75660: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41684 1727204482.77530: done with get_vars() 41684 1727204482.77576: done getting variables 41684 1727204482.77750: variable 'interface' from source: set_fact TASK [Get stat for interface ethtest1] ***************************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml:3 Tuesday 24 September 2024 15:01:22 -0400 (0:00:00.103) 0:00:39.179 ***** 41684 1727204482.77790: entering _queue_task() for managed-node1/stat 41684 1727204482.78137: worker is 1 (out of 1 available) 41684 1727204482.78150: exiting _queue_task() for managed-node1/stat 41684 1727204482.78162: done queuing things up, now waiting for results queue to drain 41684 1727204482.78168: waiting for pending results... 
41684 1727204482.78469: running TaskExecutor() for managed-node1/TASK: Get stat for interface ethtest1 41684 1727204482.78584: in run() - task 0affcd87-79f5-3839-086d-0000000008bc 41684 1727204482.78599: variable 'ansible_search_path' from source: unknown 41684 1727204482.78602: variable 'ansible_search_path' from source: unknown 41684 1727204482.78642: calling self._execute() 41684 1727204482.78744: variable 'ansible_host' from source: host vars for 'managed-node1' 41684 1727204482.78748: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41684 1727204482.78768: variable 'omit' from source: magic vars 41684 1727204482.79149: variable 'ansible_distribution_major_version' from source: facts 41684 1727204482.79170: Evaluated conditional (ansible_distribution_major_version != '6'): True 41684 1727204482.79176: variable 'omit' from source: magic vars 41684 1727204482.79251: variable 'omit' from source: magic vars 41684 1727204482.79425: variable 'interface' from source: set_fact 41684 1727204482.80257: variable 'omit' from source: magic vars 41684 1727204482.80261: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 41684 1727204482.80271: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 41684 1727204482.80273: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 41684 1727204482.80276: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41684 1727204482.80279: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41684 1727204482.80281: variable 'inventory_hostname' from source: host vars for 'managed-node1' 41684 1727204482.80283: variable 'ansible_host' from source: host vars for 'managed-node1' 41684 1727204482.80285: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41684 1727204482.80287: Set connection var ansible_connection to ssh 41684 1727204482.80289: Set connection var ansible_pipelining to False 41684 1727204482.80291: Set connection var ansible_module_compression to ZIP_DEFLATED 41684 1727204482.80293: Set connection var ansible_timeout to 10 41684 1727204482.80295: Set connection var ansible_shell_executable to /bin/sh 41684 1727204482.80297: Set connection var ansible_shell_type to sh 41684 1727204482.80304: variable 'ansible_shell_executable' from source: unknown 41684 1727204482.80306: variable 'ansible_connection' from source: unknown 41684 1727204482.80308: variable 'ansible_module_compression' from source: unknown 41684 1727204482.80310: variable 'ansible_shell_type' from source: unknown 41684 1727204482.80312: variable 'ansible_shell_executable' from source: unknown 41684 1727204482.80314: variable 'ansible_host' from source: host vars for 'managed-node1' 41684 1727204482.80316: variable 'ansible_pipelining' from source: unknown 41684 1727204482.80318: variable 'ansible_timeout' from source: unknown 41684 1727204482.80320: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41684 1727204482.80323: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 41684 1727204482.80325: variable 'omit' from source: magic vars 41684 1727204482.80327: starting attempt loop 41684 1727204482.80330: running the handler 41684 1727204482.80332: _low_level_execute_command(): starting 41684 1727204482.80334: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 41684 1727204482.81000: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 41684 1727204482.81012: stderr chunk (state=3): >>>debug1: 
Reading configuration data /root/.ssh/config <<< 41684 1727204482.81023: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41684 1727204482.81036: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41684 1727204482.81075: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 41684 1727204482.81089: stderr chunk (state=3): >>>debug2: match not found <<< 41684 1727204482.81100: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204482.81113: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 41684 1727204482.81122: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 41684 1727204482.81129: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 41684 1727204482.81136: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41684 1727204482.81145: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41684 1727204482.81157: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41684 1727204482.81170: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 41684 1727204482.81173: stderr chunk (state=3): >>>debug2: match found <<< 41684 1727204482.81182: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204482.81260: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 41684 1727204482.81281: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41684 1727204482.81294: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41684 1727204482.81386: stderr chunk (state=3): >>>debug1: 
mux_client_request_session: master session id: 2 <<< 41684 1727204482.83028: stdout chunk (state=3): >>>/root <<< 41684 1727204482.83187: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41684 1727204482.83212: stderr chunk (state=3): >>><<< 41684 1727204482.83215: stdout chunk (state=3): >>><<< 41684 1727204482.83242: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41684 1727204482.83256: _low_level_execute_command(): starting 41684 1727204482.83259: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204482.832409-44541-131215439056756 `" && echo ansible-tmp-1727204482.832409-44541-131215439056756="` echo /root/.ansible/tmp/ansible-tmp-1727204482.832409-44541-131215439056756 `" ) && sleep 0' 41684 
1727204482.83979: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 41684 1727204482.83982: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41684 1727204482.83985: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41684 1727204482.83987: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41684 1727204482.83989: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 41684 1727204482.83991: stderr chunk (state=3): >>>debug2: match not found <<< 41684 1727204482.83997: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204482.84071: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 41684 1727204482.84075: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 41684 1727204482.84077: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 41684 1727204482.84080: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41684 1727204482.84082: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41684 1727204482.84084: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41684 1727204482.84086: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 41684 1727204482.84088: stderr chunk (state=3): >>>debug2: match found <<< 41684 1727204482.84090: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204482.84158: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 41684 1727204482.84177: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41684 1727204482.84180: stderr 
chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41684 1727204482.84260: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41684 1727204482.86103: stdout chunk (state=3): >>>ansible-tmp-1727204482.832409-44541-131215439056756=/root/.ansible/tmp/ansible-tmp-1727204482.832409-44541-131215439056756 <<< 41684 1727204482.86220: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41684 1727204482.86269: stderr chunk (state=3): >>><<< 41684 1727204482.86273: stdout chunk (state=3): >>><<< 41684 1727204482.86286: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204482.832409-44541-131215439056756=/root/.ansible/tmp/ansible-tmp-1727204482.832409-44541-131215439056756 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41684 1727204482.86327: variable 'ansible_module_compression' from source: unknown 41684 
1727204482.86377: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-41684fyviudxd/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 41684 1727204482.86408: variable 'ansible_facts' from source: unknown 41684 1727204482.86472: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204482.832409-44541-131215439056756/AnsiballZ_stat.py 41684 1727204482.86583: Sending initial data 41684 1727204482.86586: Sent initial data (152 bytes) 41684 1727204482.87503: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 41684 1727204482.87531: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41684 1727204482.87615: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41684 1727204482.89332: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 
debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 41684 1727204482.89390: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 41684 1727204482.89443: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-41684fyviudxd/tmpeqkv36lo /root/.ansible/tmp/ansible-tmp-1727204482.832409-44541-131215439056756/AnsiballZ_stat.py <<< 41684 1727204482.89492: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 41684 1727204482.90382: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41684 1727204482.90469: stderr chunk (state=3): >>><<< 41684 1727204482.90473: stdout chunk (state=3): >>><<< 41684 1727204482.90475: done transferring module to remote 41684 1727204482.90477: _low_level_execute_command(): starting 41684 1727204482.90480: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204482.832409-44541-131215439056756/ /root/.ansible/tmp/ansible-tmp-1727204482.832409-44541-131215439056756/AnsiballZ_stat.py && sleep 0' 41684 1727204482.91087: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 41684 1727204482.91096: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41684 1727204482.91106: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41684 1727204482.91119: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41684 1727204482.91160: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 
10.31.9.148 <<< 41684 1727204482.91172: stderr chunk (state=3): >>>debug2: match not found <<< 41684 1727204482.91180: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204482.91192: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 41684 1727204482.91200: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 41684 1727204482.91207: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 41684 1727204482.91214: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41684 1727204482.91223: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41684 1727204482.91233: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41684 1727204482.91241: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 41684 1727204482.91249: stderr chunk (state=3): >>>debug2: match found <<< 41684 1727204482.91264: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204482.91335: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 41684 1727204482.91349: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41684 1727204482.91355: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41684 1727204482.91444: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41684 1727204482.93137: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41684 1727204482.93183: stderr chunk (state=3): >>><<< 41684 1727204482.93186: stdout chunk (state=3): >>><<< 41684 1727204482.93201: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41684 1727204482.93205: _low_level_execute_command(): starting 41684 1727204482.93209: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204482.832409-44541-131215439056756/AnsiballZ_stat.py && sleep 0' 41684 1727204482.93622: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 41684 1727204482.93629: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41684 1727204482.93666: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found <<< 41684 1727204482.93674: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204482.93680: stderr chunk (state=3): >>>debug1: 
configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 41684 1727204482.93689: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41684 1727204482.93696: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41684 1727204482.93702: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found <<< 41684 1727204482.93709: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204482.93763: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 41684 1727204482.93783: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41684 1727204482.93850: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41684 1727204483.06781: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/ethtest1", "follow": false, "checksum_algorithm": "sha1"}}}<<< 41684 1727204483.06787: stdout chunk (state=3): >>> <<< 41684 1727204483.07803: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.148 closed. 
<<< 41684 1727204483.07807: stdout chunk (state=3): >>><<< 41684 1727204483.07809: stderr chunk (state=3): >>><<< 41684 1727204483.07947: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/ethtest1", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.148 closed. 
41684 1727204483.07952: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/sys/class/net/ethtest1', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204482.832409-44541-131215439056756/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 41684 1727204483.07956: _low_level_execute_command(): starting 41684 1727204483.07958: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204482.832409-44541-131215439056756/ > /dev/null 2>&1 && sleep 0' 41684 1727204483.08593: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 41684 1727204483.08607: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41684 1727204483.08622: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41684 1727204483.08645: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41684 1727204483.08692: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 41684 1727204483.08704: stderr chunk (state=3): >>>debug2: match not found <<< 41684 1727204483.08716: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204483.08735: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 41684 1727204483.08750: 
stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 41684 1727204483.08760: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 41684 1727204483.08776: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41684 1727204483.08789: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41684 1727204483.08803: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41684 1727204483.08813: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 41684 1727204483.08823: stderr chunk (state=3): >>>debug2: match found <<< 41684 1727204483.08835: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204483.08921: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 41684 1727204483.08943: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41684 1727204483.08966: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41684 1727204483.09050: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41684 1727204483.10787: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41684 1727204483.10881: stderr chunk (state=3): >>><<< 41684 1727204483.10891: stdout chunk (state=3): >>><<< 41684 1727204483.11074: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests 
final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41684 1727204483.11077: handler run complete 41684 1727204483.11080: attempt loop complete, returning result 41684 1727204483.11082: _execute() done 41684 1727204483.11084: dumping result to json 41684 1727204483.11086: done dumping result, returning 41684 1727204483.11088: done running TaskExecutor() for managed-node1/TASK: Get stat for interface ethtest1 [0affcd87-79f5-3839-086d-0000000008bc] 41684 1727204483.11090: sending task result for task 0affcd87-79f5-3839-086d-0000000008bc 41684 1727204483.11174: done sending task result for task 0affcd87-79f5-3839-086d-0000000008bc 41684 1727204483.11180: WORKER PROCESS EXITING ok: [managed-node1] => { "changed": false, "stat": { "exists": false } } 41684 1727204483.11247: no more pending results, returning what we have 41684 1727204483.11252: results queue empty 41684 1727204483.11253: checking for any_errors_fatal 41684 1727204483.11255: done checking for any_errors_fatal 41684 1727204483.11256: checking for max_fail_percentage 41684 1727204483.11258: done checking for max_fail_percentage 41684 1727204483.11258: checking to see if all hosts have failed and the running result is not ok 41684 1727204483.11259: done checking to see if all hosts have failed 41684 1727204483.11260: getting the remaining hosts for this loop 41684 
1727204483.11266: done getting the remaining hosts for this loop 41684 1727204483.11270: getting the next task for host managed-node1 41684 1727204483.11279: done getting next task for host managed-node1 41684 1727204483.11282: ^ task is: TASK: Assert that the interface is absent - '{{ interface }}' 41684 1727204483.11286: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 41684 1727204483.11293: getting variables 41684 1727204483.11294: in VariableManager get_vars() 41684 1727204483.11341: Calling all_inventory to load vars for managed-node1 41684 1727204483.11344: Calling groups_inventory to load vars for managed-node1 41684 1727204483.11346: Calling all_plugins_inventory to load vars for managed-node1 41684 1727204483.11359: Calling all_plugins_play to load vars for managed-node1 41684 1727204483.11365: Calling groups_plugins_inventory to load vars for managed-node1 41684 1727204483.11369: Calling groups_plugins_play to load vars for managed-node1 41684 1727204483.13315: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41684 1727204483.15137: done with get_vars() 41684 1727204483.15172: done getting variables 41684 1727204483.15236: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 41684 1727204483.15373: variable 'interface' from source: set_fact TASK [Assert that the interface is absent - 'ethtest1'] ************************ task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml:5 Tuesday 24 September 2024 15:01:23 -0400 (0:00:00.376) 0:00:39.555 ***** 41684 1727204483.15410: entering _queue_task() for managed-node1/assert 41684 1727204483.15780: worker is 1 (out of 1 available) 41684 1727204483.15794: exiting _queue_task() for managed-node1/assert 41684 1727204483.15808: done queuing things up, now waiting for results queue to drain 41684 1727204483.15809: waiting for pending results... 
41684 1727204483.16120: running TaskExecutor() for managed-node1/TASK: Assert that the interface is absent - 'ethtest1' 41684 1727204483.16242: in run() - task 0affcd87-79f5-3839-086d-000000000817 41684 1727204483.16274: variable 'ansible_search_path' from source: unknown 41684 1727204483.16281: variable 'ansible_search_path' from source: unknown 41684 1727204483.16320: calling self._execute() 41684 1727204483.16426: variable 'ansible_host' from source: host vars for 'managed-node1' 41684 1727204483.16437: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41684 1727204483.16453: variable 'omit' from source: magic vars 41684 1727204483.16852: variable 'ansible_distribution_major_version' from source: facts 41684 1727204483.16874: Evaluated conditional (ansible_distribution_major_version != '6'): True 41684 1727204483.16885: variable 'omit' from source: magic vars 41684 1727204483.16948: variable 'omit' from source: magic vars 41684 1727204483.17057: variable 'interface' from source: set_fact 41684 1727204483.17084: variable 'omit' from source: magic vars 41684 1727204483.17138: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 41684 1727204483.17184: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 41684 1727204483.17210: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 41684 1727204483.17238: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41684 1727204483.17257: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41684 1727204483.17294: variable 'inventory_hostname' from source: host vars for 'managed-node1' 41684 1727204483.17302: variable 'ansible_host' from source: host vars for 'managed-node1' 41684 1727204483.17309: 
variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41684 1727204483.17419: Set connection var ansible_connection to ssh 41684 1727204483.17429: Set connection var ansible_pipelining to False 41684 1727204483.17438: Set connection var ansible_module_compression to ZIP_DEFLATED 41684 1727204483.17451: Set connection var ansible_timeout to 10 41684 1727204483.17471: Set connection var ansible_shell_executable to /bin/sh 41684 1727204483.17478: Set connection var ansible_shell_type to sh 41684 1727204483.17508: variable 'ansible_shell_executable' from source: unknown 41684 1727204483.17516: variable 'ansible_connection' from source: unknown 41684 1727204483.17522: variable 'ansible_module_compression' from source: unknown 41684 1727204483.17528: variable 'ansible_shell_type' from source: unknown 41684 1727204483.17534: variable 'ansible_shell_executable' from source: unknown 41684 1727204483.17539: variable 'ansible_host' from source: host vars for 'managed-node1' 41684 1727204483.17546: variable 'ansible_pipelining' from source: unknown 41684 1727204483.17556: variable 'ansible_timeout' from source: unknown 41684 1727204483.17571: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41684 1727204483.17715: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 41684 1727204483.17731: variable 'omit' from source: magic vars 41684 1727204483.17740: starting attempt loop 41684 1727204483.17746: running the handler 41684 1727204483.17919: variable 'interface_stat' from source: set_fact 41684 1727204483.17934: Evaluated conditional (not interface_stat.stat.exists): True 41684 1727204483.17944: handler run complete 41684 1727204483.17967: attempt loop complete, returning result 
41684 1727204483.17976: _execute() done 41684 1727204483.17984: dumping result to json 41684 1727204483.17995: done dumping result, returning 41684 1727204483.18008: done running TaskExecutor() for managed-node1/TASK: Assert that the interface is absent - 'ethtest1' [0affcd87-79f5-3839-086d-000000000817] 41684 1727204483.18019: sending task result for task 0affcd87-79f5-3839-086d-000000000817 ok: [managed-node1] => { "changed": false } MSG: All assertions passed 41684 1727204483.18176: no more pending results, returning what we have 41684 1727204483.18181: results queue empty 41684 1727204483.18182: checking for any_errors_fatal 41684 1727204483.18192: done checking for any_errors_fatal 41684 1727204483.18193: checking for max_fail_percentage 41684 1727204483.18194: done checking for max_fail_percentage 41684 1727204483.18195: checking to see if all hosts have failed and the running result is not ok 41684 1727204483.18196: done checking to see if all hosts have failed 41684 1727204483.18197: getting the remaining hosts for this loop 41684 1727204483.18198: done getting the remaining hosts for this loop 41684 1727204483.18203: getting the next task for host managed-node1 41684 1727204483.18212: done getting next task for host managed-node1 41684 1727204483.18215: ^ task is: TASK: Set interface0 41684 1727204483.18218: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 41684 1727204483.18222: getting variables 41684 1727204483.18224: in VariableManager get_vars() 41684 1727204483.18276: Calling all_inventory to load vars for managed-node1 41684 1727204483.18280: Calling groups_inventory to load vars for managed-node1 41684 1727204483.18282: Calling all_plugins_inventory to load vars for managed-node1 41684 1727204483.18294: Calling all_plugins_play to load vars for managed-node1 41684 1727204483.18296: Calling groups_plugins_inventory to load vars for managed-node1 41684 1727204483.18298: Calling groups_plugins_play to load vars for managed-node1 41684 1727204483.18893: done sending task result for task 0affcd87-79f5-3839-086d-000000000817 41684 1727204483.18896: WORKER PROCESS EXITING 41684 1727204483.19807: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41684 1727204483.21614: done with get_vars() 41684 1727204483.21647: done getting variables 41684 1727204483.21714: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set interface0] ********************************************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_route_device.yml:155 Tuesday 24 September 2024 15:01:23 -0400 (0:00:00.063) 0:00:39.618 ***** 41684 1727204483.21746: entering _queue_task() for managed-node1/set_fact 41684 1727204483.22098: worker is 1 (out of 1 available) 41684 1727204483.22112: exiting _queue_task() for managed-node1/set_fact 41684 1727204483.22126: done queuing things up, now waiting for results queue to drain 41684 1727204483.22127: waiting for pending results... 
41684 1727204483.22459: running TaskExecutor() for managed-node1/TASK: Set interface0 41684 1727204483.22589: in run() - task 0affcd87-79f5-3839-086d-0000000000b7 41684 1727204483.22614: variable 'ansible_search_path' from source: unknown 41684 1727204483.22658: calling self._execute() 41684 1727204483.22782: variable 'ansible_host' from source: host vars for 'managed-node1' 41684 1727204483.22796: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41684 1727204483.22818: variable 'omit' from source: magic vars 41684 1727204483.23235: variable 'ansible_distribution_major_version' from source: facts 41684 1727204483.23267: Evaluated conditional (ansible_distribution_major_version != '6'): True 41684 1727204483.23281: variable 'omit' from source: magic vars 41684 1727204483.23323: variable 'omit' from source: magic vars 41684 1727204483.23361: variable 'interface0' from source: play vars 41684 1727204483.23444: variable 'interface0' from source: play vars 41684 1727204483.23478: variable 'omit' from source: magic vars 41684 1727204483.23526: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 41684 1727204483.23573: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 41684 1727204483.23604: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 41684 1727204483.23627: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41684 1727204483.23645: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41684 1727204483.23689: variable 'inventory_hostname' from source: host vars for 'managed-node1' 41684 1727204483.23703: variable 'ansible_host' from source: host vars for 'managed-node1' 41684 1727204483.23712: variable 'ansible_ssh_extra_args' from source: 
host vars for 'managed-node1' 41684 1727204483.23830: Set connection var ansible_connection to ssh 41684 1727204483.23842: Set connection var ansible_pipelining to False 41684 1727204483.23853: Set connection var ansible_module_compression to ZIP_DEFLATED 41684 1727204483.23868: Set connection var ansible_timeout to 10 41684 1727204483.23882: Set connection var ansible_shell_executable to /bin/sh 41684 1727204483.23889: Set connection var ansible_shell_type to sh 41684 1727204483.23926: variable 'ansible_shell_executable' from source: unknown 41684 1727204483.23934: variable 'ansible_connection' from source: unknown 41684 1727204483.23941: variable 'ansible_module_compression' from source: unknown 41684 1727204483.23948: variable 'ansible_shell_type' from source: unknown 41684 1727204483.23955: variable 'ansible_shell_executable' from source: unknown 41684 1727204483.23967: variable 'ansible_host' from source: host vars for 'managed-node1' 41684 1727204483.23977: variable 'ansible_pipelining' from source: unknown 41684 1727204483.23985: variable 'ansible_timeout' from source: unknown 41684 1727204483.23992: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41684 1727204483.24156: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 41684 1727204483.24179: variable 'omit' from source: magic vars 41684 1727204483.24189: starting attempt loop 41684 1727204483.24196: running the handler 41684 1727204483.24212: handler run complete 41684 1727204483.24233: attempt loop complete, returning result 41684 1727204483.24241: _execute() done 41684 1727204483.24252: dumping result to json 41684 1727204483.24259: done dumping result, returning 41684 1727204483.24276: done running TaskExecutor() for 
managed-node1/TASK: Set interface0 [0affcd87-79f5-3839-086d-0000000000b7] 41684 1727204483.24289: sending task result for task 0affcd87-79f5-3839-086d-0000000000b7 ok: [managed-node1] => { "ansible_facts": { "interface": "ethtest0" }, "changed": false } 41684 1727204483.24446: no more pending results, returning what we have 41684 1727204483.24450: results queue empty 41684 1727204483.24452: checking for any_errors_fatal 41684 1727204483.24459: done checking for any_errors_fatal 41684 1727204483.24460: checking for max_fail_percentage 41684 1727204483.24465: done checking for max_fail_percentage 41684 1727204483.24466: checking to see if all hosts have failed and the running result is not ok 41684 1727204483.24467: done checking to see if all hosts have failed 41684 1727204483.24468: getting the remaining hosts for this loop 41684 1727204483.24470: done getting the remaining hosts for this loop 41684 1727204483.24475: getting the next task for host managed-node1 41684 1727204483.24483: done getting next task for host managed-node1 41684 1727204483.24486: ^ task is: TASK: Delete interface0 41684 1727204483.24489: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 41684 1727204483.24493: getting variables 41684 1727204483.24495: in VariableManager get_vars() 41684 1727204483.24539: Calling all_inventory to load vars for managed-node1 41684 1727204483.24542: Calling groups_inventory to load vars for managed-node1 41684 1727204483.24545: Calling all_plugins_inventory to load vars for managed-node1 41684 1727204483.24556: Calling all_plugins_play to load vars for managed-node1 41684 1727204483.24560: Calling groups_plugins_inventory to load vars for managed-node1 41684 1727204483.24571: Calling groups_plugins_play to load vars for managed-node1 41684 1727204483.25587: done sending task result for task 0affcd87-79f5-3839-086d-0000000000b7 41684 1727204483.25591: WORKER PROCESS EXITING 41684 1727204483.25934: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41684 1727204483.26885: done with get_vars() 41684 1727204483.26904: done getting variables TASK [Delete interface0] ******************************************************* task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_route_device.yml:158 Tuesday 24 September 2024 15:01:23 -0400 (0:00:00.052) 0:00:39.671 ***** 41684 1727204483.27013: entering _queue_task() for managed-node1/include_tasks 41684 1727204483.27347: worker is 1 (out of 1 available) 41684 1727204483.27361: exiting _queue_task() for managed-node1/include_tasks 41684 1727204483.27379: done queuing things up, now waiting for results queue to drain 41684 1727204483.27380: waiting for pending results... 
41684 1727204483.27683: running TaskExecutor() for managed-node1/TASK: Delete interface0 41684 1727204483.27805: in run() - task 0affcd87-79f5-3839-086d-0000000000b8 41684 1727204483.27832: variable 'ansible_search_path' from source: unknown 41684 1727204483.27880: calling self._execute() 41684 1727204483.27990: variable 'ansible_host' from source: host vars for 'managed-node1' 41684 1727204483.28002: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41684 1727204483.28018: variable 'omit' from source: magic vars 41684 1727204483.28404: variable 'ansible_distribution_major_version' from source: facts 41684 1727204483.28422: Evaluated conditional (ansible_distribution_major_version != '6'): True 41684 1727204483.28429: _execute() done 41684 1727204483.28432: dumping result to json 41684 1727204483.28435: done dumping result, returning 41684 1727204483.28444: done running TaskExecutor() for managed-node1/TASK: Delete interface0 [0affcd87-79f5-3839-086d-0000000000b8] 41684 1727204483.28449: sending task result for task 0affcd87-79f5-3839-086d-0000000000b8 41684 1727204483.28538: done sending task result for task 0affcd87-79f5-3839-086d-0000000000b8 41684 1727204483.28540: WORKER PROCESS EXITING 41684 1727204483.28601: no more pending results, returning what we have 41684 1727204483.28606: in VariableManager get_vars() 41684 1727204483.28656: Calling all_inventory to load vars for managed-node1 41684 1727204483.28659: Calling groups_inventory to load vars for managed-node1 41684 1727204483.28661: Calling all_plugins_inventory to load vars for managed-node1 41684 1727204483.28674: Calling all_plugins_play to load vars for managed-node1 41684 1727204483.28677: Calling groups_plugins_inventory to load vars for managed-node1 41684 1727204483.28679: Calling groups_plugins_play to load vars for managed-node1 41684 1727204483.29506: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41684 
1727204483.30826: done with get_vars() 41684 1727204483.30847: variable 'ansible_search_path' from source: unknown 41684 1727204483.30866: we have included files to process 41684 1727204483.30868: generating all_blocks data 41684 1727204483.30869: done generating all_blocks data 41684 1727204483.30874: processing included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/delete_interface.yml 41684 1727204483.30875: loading included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/delete_interface.yml 41684 1727204483.30878: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/delete_interface.yml 41684 1727204483.31076: done processing included file 41684 1727204483.31079: iterating over new_blocks loaded from include file 41684 1727204483.31080: in VariableManager get_vars() 41684 1727204483.31102: done with get_vars() 41684 1727204483.31105: filtering new block on tags 41684 1727204483.31129: done filtering new block on tags 41684 1727204483.31131: done iterating over new_blocks loaded from include file included: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/delete_interface.yml for managed-node1 41684 1727204483.31136: extending task lists for all hosts with included blocks 41684 1727204483.32608: done extending task lists 41684 1727204483.32609: done processing included files 41684 1727204483.32610: results queue empty 41684 1727204483.32610: checking for any_errors_fatal 41684 1727204483.32613: done checking for any_errors_fatal 41684 1727204483.32613: checking for max_fail_percentage 41684 1727204483.32614: done checking for max_fail_percentage 41684 1727204483.32614: checking to see if all hosts have failed and the running result is not ok 41684 1727204483.32615: done checking to see if all hosts have failed 41684 1727204483.32616: getting 
the remaining hosts for this loop 41684 1727204483.32617: done getting the remaining hosts for this loop 41684 1727204483.32619: getting the next task for host managed-node1 41684 1727204483.32622: done getting next task for host managed-node1 41684 1727204483.32624: ^ task is: TASK: Remove test interface if necessary 41684 1727204483.32626: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 41684 1727204483.32628: getting variables 41684 1727204483.32628: in VariableManager get_vars() 41684 1727204483.32641: Calling all_inventory to load vars for managed-node1 41684 1727204483.32643: Calling groups_inventory to load vars for managed-node1 41684 1727204483.32644: Calling all_plugins_inventory to load vars for managed-node1 41684 1727204483.32648: Calling all_plugins_play to load vars for managed-node1 41684 1727204483.32650: Calling groups_plugins_inventory to load vars for managed-node1 41684 1727204483.32651: Calling groups_plugins_play to load vars for managed-node1 41684 1727204483.33376: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41684 1727204483.34467: done with get_vars() 41684 1727204483.34492: done getting variables 41684 1727204483.34537: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Remove test interface if necessary] ************************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/delete_interface.yml:3 Tuesday 24 September 2024 15:01:23 -0400 (0:00:00.075) 0:00:39.747 ***** 41684 1727204483.34575: entering _queue_task() for managed-node1/command 41684 1727204483.34949: worker is 1 (out of 1 available) 41684 1727204483.34968: exiting _queue_task() for managed-node1/command 41684 1727204483.34982: done queuing things up, now waiting for results queue to drain 41684 1727204483.34983: waiting for pending results... 
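Every `_low_level_execute_command()` call traced below follows the same shape: the command string is handed to the remote `/bin/sh -c '<cmd> && sleep 0'`, and the first probe (`echo ~`) discovers the remote home directory. A minimal local sketch of that wrapping, using `subprocess` in place of the SSH transport (the function name and local execution are illustrative assumptions, not Ansible's API):

```python
import subprocess

def low_level_execute(cmd: str):
    # Mirror the wrapper visible in the log: /bin/sh -c '<cmd> && sleep 0'.
    # The trailing "sleep 0" lets the shell flush its output before exiting.
    proc = subprocess.run(
        ["/bin/sh", "-c", f"{cmd} && sleep 0"],
        capture_output=True,
        text=True,
    )
    return proc.returncode, proc.stdout, proc.stderr

# The first probe in the log: discover the remote user's home directory.
rc, out, err = low_level_execute("echo ~")
```

In the real run the same string is passed through the SSH connection plugin instead of a local shell, which is why each call in the log is preceded by the OpenSSH debug chatter.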
41684 1727204483.35305: running TaskExecutor() for managed-node1/TASK: Remove test interface if necessary 41684 1727204483.35427: in run() - task 0affcd87-79f5-3839-086d-0000000008da 41684 1727204483.35431: variable 'ansible_search_path' from source: unknown 41684 1727204483.35434: variable 'ansible_search_path' from source: unknown 41684 1727204483.35535: calling self._execute() 41684 1727204483.35645: variable 'ansible_host' from source: host vars for 'managed-node1' 41684 1727204483.35649: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41684 1727204483.35652: variable 'omit' from source: magic vars 41684 1727204483.35982: variable 'ansible_distribution_major_version' from source: facts 41684 1727204483.35995: Evaluated conditional (ansible_distribution_major_version != '6'): True 41684 1727204483.36001: variable 'omit' from source: magic vars 41684 1727204483.36054: variable 'omit' from source: magic vars 41684 1727204483.36159: variable 'interface' from source: set_fact 41684 1727204483.36185: variable 'omit' from source: magic vars 41684 1727204483.36236: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 41684 1727204483.36280: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 41684 1727204483.36302: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 41684 1727204483.36315: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41684 1727204483.36324: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41684 1727204483.36355: variable 'inventory_hostname' from source: host vars for 'managed-node1' 41684 1727204483.36359: variable 'ansible_host' from source: host vars for 'managed-node1' 41684 1727204483.36362: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41684 1727204483.36457: Set connection var ansible_connection to ssh 41684 1727204483.36470: Set connection var ansible_pipelining to False 41684 1727204483.36476: Set connection var ansible_module_compression to ZIP_DEFLATED 41684 1727204483.36483: Set connection var ansible_timeout to 10 41684 1727204483.36490: Set connection var ansible_shell_executable to /bin/sh 41684 1727204483.36493: Set connection var ansible_shell_type to sh 41684 1727204483.36517: variable 'ansible_shell_executable' from source: unknown 41684 1727204483.36521: variable 'ansible_connection' from source: unknown 41684 1727204483.36523: variable 'ansible_module_compression' from source: unknown 41684 1727204483.36526: variable 'ansible_shell_type' from source: unknown 41684 1727204483.36529: variable 'ansible_shell_executable' from source: unknown 41684 1727204483.36531: variable 'ansible_host' from source: host vars for 'managed-node1' 41684 1727204483.36533: variable 'ansible_pipelining' from source: unknown 41684 1727204483.36536: variable 'ansible_timeout' from source: unknown 41684 1727204483.36538: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41684 1727204483.36680: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 41684 1727204483.36702: variable 'omit' from source: magic vars 41684 1727204483.36711: starting attempt loop 41684 1727204483.36718: running the handler 41684 1727204483.36745: _low_level_execute_command(): starting 41684 1727204483.36755: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 41684 1727204483.37569: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 41684 
1727204483.37586: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41684 1727204483.37606: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41684 1727204483.37630: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41684 1727204483.37675: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 41684 1727204483.37689: stderr chunk (state=3): >>>debug2: match not found <<< 41684 1727204483.37707: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204483.37726: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 41684 1727204483.37743: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 41684 1727204483.37755: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 41684 1727204483.37771: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41684 1727204483.37785: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41684 1727204483.37800: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41684 1727204483.37817: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 41684 1727204483.37830: stderr chunk (state=3): >>>debug2: match found <<< 41684 1727204483.37845: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204483.37930: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 41684 1727204483.37948: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41684 1727204483.37969: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41684 1727204483.38061: 
stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41684 1727204483.39749: stdout chunk (state=3): >>>/root <<< 41684 1727204483.39848: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41684 1727204483.39978: stderr chunk (state=3): >>><<< 41684 1727204483.39981: stdout chunk (state=3): >>><<< 41684 1727204483.39987: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41684 1727204483.39991: _low_level_execute_command(): starting 41684 1727204483.39994: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204483.3994913-44565-75069672286701 `" && echo ansible-tmp-1727204483.3994913-44565-75069672286701="` echo /root/.ansible/tmp/ansible-tmp-1727204483.3994913-44565-75069672286701 
`" ) && sleep 0' 41684 1727204483.40607: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 41684 1727204483.40617: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41684 1727204483.40636: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41684 1727204483.40639: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41684 1727204483.40688: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 41684 1727204483.40692: stderr chunk (state=3): >>>debug2: match not found <<< 41684 1727204483.40696: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204483.40711: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 41684 1727204483.40718: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 41684 1727204483.40725: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 41684 1727204483.40733: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41684 1727204483.40742: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41684 1727204483.40753: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41684 1727204483.40760: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 41684 1727204483.40787: stderr chunk (state=3): >>>debug2: match found <<< 41684 1727204483.40789: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204483.40860: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 41684 1727204483.40882: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41684 
1727204483.40885: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41684 1727204483.40961: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41684 1727204483.42843: stdout chunk (state=3): >>>ansible-tmp-1727204483.3994913-44565-75069672286701=/root/.ansible/tmp/ansible-tmp-1727204483.3994913-44565-75069672286701 <<< 41684 1727204483.42949: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41684 1727204483.43019: stderr chunk (state=3): >>><<< 41684 1727204483.43022: stdout chunk (state=3): >>><<< 41684 1727204483.43040: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204483.3994913-44565-75069672286701=/root/.ansible/tmp/ansible-tmp-1727204483.3994913-44565-75069672286701 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41684 1727204483.43074: variable 'ansible_module_compression' 
from source: unknown 41684 1727204483.43116: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-41684fyviudxd/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 41684 1727204483.43144: variable 'ansible_facts' from source: unknown 41684 1727204483.43209: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204483.3994913-44565-75069672286701/AnsiballZ_command.py 41684 1727204483.43322: Sending initial data 41684 1727204483.43326: Sent initial data (155 bytes) 41684 1727204483.44034: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 41684 1727204483.44041: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41684 1727204483.44080: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204483.44086: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 41684 1727204483.44095: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41684 1727204483.44103: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41684 1727204483.44108: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found <<< 41684 1727204483.44114: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204483.44175: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 41684 1727204483.44184: stderr chunk (state=3): >>>debug2: fd 3 
setting O_NONBLOCK <<< 41684 1727204483.44191: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41684 1727204483.44266: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41684 1727204483.45984: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 41684 1727204483.46032: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 41684 1727204483.46085: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-41684fyviudxd/tmpmh_odfqq /root/.ansible/tmp/ansible-tmp-1727204483.3994913-44565-75069672286701/AnsiballZ_command.py <<< 41684 1727204483.46136: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 41684 1727204483.46975: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41684 1727204483.47091: stderr chunk (state=3): >>><<< 41684 1727204483.47095: stdout chunk (state=3): >>><<< 41684 1727204483.47113: done transferring module to remote 41684 1727204483.47123: _low_level_execute_command(): starting 41684 1727204483.47127: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204483.3994913-44565-75069672286701/ /root/.ansible/tmp/ansible-tmp-1727204483.3994913-44565-75069672286701/AnsiballZ_command.py && sleep 0' 41684 1727204483.47599: 
stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 41684 1727204483.47608: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41684 1727204483.47658: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found <<< 41684 1727204483.47662: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 41684 1727204483.47668: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204483.47716: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 41684 1727204483.47724: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41684 1727204483.47800: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41684 1727204483.49542: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41684 1727204483.49602: stderr chunk (state=3): >>><<< 41684 1727204483.49605: stdout chunk (state=3): >>><<< 41684 1727204483.49620: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41684 1727204483.49623: _low_level_execute_command(): starting 41684 1727204483.49628: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204483.3994913-44565-75069672286701/AnsiballZ_command.py && sleep 0' 41684 1727204483.50166: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 41684 1727204483.50183: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41684 1727204483.50201: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41684 1727204483.50220: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41684 1727204483.50262: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 41684 1727204483.50278: stderr chunk (state=3): >>>debug2: match not found <<< 41684 1727204483.50293: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config 
<<< 41684 1727204483.50315: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 41684 1727204483.50328: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 41684 1727204483.50340: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 41684 1727204483.50353: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41684 1727204483.50370: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41684 1727204483.50389: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41684 1727204483.50401: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 41684 1727204483.50412: stderr chunk (state=3): >>>debug2: match found <<< 41684 1727204483.50430: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204483.50507: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 41684 1727204483.50524: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41684 1727204483.50544: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41684 1727204483.50643: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41684 1727204483.64694: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "del", "ethtest0"], "start": "2024-09-24 15:01:23.637473", "end": "2024-09-24 15:01:23.646108", "delta": "0:00:00.008635", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link del ethtest0", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 41684 1727204483.66085: stderr chunk (state=3): 
>>>debug2: Received exit status from master 0 Shared connection to 10.31.9.148 closed. <<< 41684 1727204483.66144: stderr chunk (state=3): >>><<< 41684 1727204483.66147: stdout chunk (state=3): >>><<< 41684 1727204483.66169: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "del", "ethtest0"], "start": "2024-09-24 15:01:23.637473", "end": "2024-09-24 15:01:23.646108", "delta": "0:00:00.008635", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link del ethtest0", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.148 closed. 
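The module's stdout captured above is a single JSON document that Ansible parses back into the task result. A small sketch of reading those fields, with the raw string abbreviated from the log (the `invocation` block present in the real payload is omitted here):

```python
import json

# Abbreviated copy of the AnsiballZ_command.py stdout shown in the log.
raw = (
    '{"changed": true, "stdout": "", "stderr": "", "rc": 0, '
    '"cmd": ["ip", "link", "del", "ethtest0"], '
    '"start": "2024-09-24 15:01:23.637473", '
    '"end": "2024-09-24 15:01:23.646108", '
    '"delta": "0:00:00.008635", "msg": ""}'
)

result = json.loads(raw)
# rc and cmd are what a caller usually inspects first.
rc = result["rc"]
cmd = result["cmd"]
```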
41684 1727204483.66200: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip link del ethtest0', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204483.3994913-44565-75069672286701/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 41684 1727204483.66210: _low_level_execute_command(): starting 41684 1727204483.66212: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204483.3994913-44565-75069672286701/ > /dev/null 2>&1 && sleep 0' 41684 1727204483.66689: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 41684 1727204483.66695: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41684 1727204483.66743: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204483.66746: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 41684 1727204483.66749: stderr chunk (state=3): >>>debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204483.66799: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 41684 1727204483.66812: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 41684 1727204483.66883: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41684 1727204483.68648: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41684 1727204483.68703: stderr chunk (state=3): >>><<< 41684 1727204483.68708: stdout chunk (state=3): >>><<< 41684 1727204483.68723: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from 
master 0 41684 1727204483.68728: handler run complete 41684 1727204483.68751: Evaluated conditional (False): False 41684 1727204483.68759: attempt loop complete, returning result 41684 1727204483.68765: _execute() done 41684 1727204483.68768: dumping result to json 41684 1727204483.68770: done dumping result, returning 41684 1727204483.68778: done running TaskExecutor() for managed-node1/TASK: Remove test interface if necessary [0affcd87-79f5-3839-086d-0000000008da] 41684 1727204483.68784: sending task result for task 0affcd87-79f5-3839-086d-0000000008da 41684 1727204483.68884: done sending task result for task 0affcd87-79f5-3839-086d-0000000008da 41684 1727204483.68887: WORKER PROCESS EXITING ok: [managed-node1] => { "changed": false, "cmd": [ "ip", "link", "del", "ethtest0" ], "delta": "0:00:00.008635", "end": "2024-09-24 15:01:23.646108", "rc": 0, "start": "2024-09-24 15:01:23.637473" } 41684 1727204483.68969: no more pending results, returning what we have 41684 1727204483.68974: results queue empty 41684 1727204483.68975: checking for any_errors_fatal 41684 1727204483.68976: done checking for any_errors_fatal 41684 1727204483.68977: checking for max_fail_percentage 41684 1727204483.68978: done checking for max_fail_percentage 41684 1727204483.68979: checking to see if all hosts have failed and the running result is not ok 41684 1727204483.68980: done checking to see if all hosts have failed 41684 1727204483.68981: getting the remaining hosts for this loop 41684 1727204483.68983: done getting the remaining hosts for this loop 41684 1727204483.68987: getting the next task for host managed-node1 41684 1727204483.68994: done getting next task for host managed-node1 41684 1727204483.68999: ^ task is: TASK: Assert interface0 is absent 41684 1727204483.69002: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(None), rescue child state? (None), always child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 41684 1727204483.69006: getting variables 41684 1727204483.69008: in VariableManager get_vars() 41684 1727204483.69046: Calling all_inventory to load vars for managed-node1 41684 1727204483.69048: Calling groups_inventory to load vars for managed-node1 41684 1727204483.69050: Calling all_plugins_inventory to load vars for managed-node1 41684 1727204483.69061: Calling all_plugins_play to load vars for managed-node1 41684 1727204483.69073: Calling groups_plugins_inventory to load vars for managed-node1 41684 1727204483.69077: Calling groups_plugins_play to load vars for managed-node1 41684 1727204483.70003: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41684 1727204483.70933: done with get_vars() 41684 1727204483.70950: done getting variables TASK [Assert interface0 is absent] ********************************************* task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_route_device.yml:160 Tuesday 24 September 2024 15:01:23 -0400 (0:00:00.364) 0:00:40.111 ***** 41684 1727204483.71023: entering _queue_task() for managed-node1/include_tasks 41684 1727204483.71248: worker is 1 (out of 1 available) 41684 1727204483.71261: exiting _queue_task() for managed-node1/include_tasks 41684 1727204483.71277: done queuing things up, now waiting for results queue to drain 41684 1727204483.71278: waiting for pending results... 
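The scratch directory created, used, and removed during the task above (`/root/.ansible/tmp/ansible-tmp-1727204483.3994913-44565-75069672286701`) follows a recognisable pattern: prefix, epoch timestamp, worker PID, random suffix. A sketch reconstructing that naming scheme; the interpretation of each field is inferred from the log, not taken from Ansible's source:

```python
import os
import random
import time

def remote_tmp_name(prefix: str = "ansible-tmp") -> str:
    # <prefix>-<epoch seconds>-<pid>-<random>: matches the directory names in
    # the log, e.g. ansible-tmp-1727204483.3994913-44565-75069672286701. How
    # Ansible actually derives each field is an assumption here.
    return f"{prefix}-{time.time()}-{os.getpid()}-{random.randint(0, 2**48)}"
```

The uniqueness matters because, as the log shows, the directory is created, written to, executed from, and then removed with `rm -f -r` within a single task.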
41684 1727204483.71455: running TaskExecutor() for managed-node1/TASK: Assert interface0 is absent 41684 1727204483.71528: in run() - task 0affcd87-79f5-3839-086d-0000000000b9 41684 1727204483.71541: variable 'ansible_search_path' from source: unknown 41684 1727204483.71576: calling self._execute() 41684 1727204483.71649: variable 'ansible_host' from source: host vars for 'managed-node1' 41684 1727204483.71653: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41684 1727204483.71671: variable 'omit' from source: magic vars 41684 1727204483.71936: variable 'ansible_distribution_major_version' from source: facts 41684 1727204483.71947: Evaluated conditional (ansible_distribution_major_version != '6'): True 41684 1727204483.71952: _execute() done 41684 1727204483.71956: dumping result to json 41684 1727204483.71958: done dumping result, returning 41684 1727204483.71968: done running TaskExecutor() for managed-node1/TASK: Assert interface0 is absent [0affcd87-79f5-3839-086d-0000000000b9] 41684 1727204483.71972: sending task result for task 0affcd87-79f5-3839-086d-0000000000b9 41684 1727204483.72057: done sending task result for task 0affcd87-79f5-3839-086d-0000000000b9 41684 1727204483.72060: WORKER PROCESS EXITING 41684 1727204483.72093: no more pending results, returning what we have 41684 1727204483.72098: in VariableManager get_vars() 41684 1727204483.72142: Calling all_inventory to load vars for managed-node1 41684 1727204483.72146: Calling groups_inventory to load vars for managed-node1 41684 1727204483.72148: Calling all_plugins_inventory to load vars for managed-node1 41684 1727204483.72159: Calling all_plugins_play to load vars for managed-node1 41684 1727204483.72166: Calling groups_plugins_inventory to load vars for managed-node1 41684 1727204483.72170: Calling groups_plugins_play to load vars for managed-node1 41684 1727204483.72969: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to 
reserved name 41684 1727204483.73892: done with get_vars() 41684 1727204483.73905: variable 'ansible_search_path' from source: unknown 41684 1727204483.73917: we have included files to process 41684 1727204483.73918: generating all_blocks data 41684 1727204483.73920: done generating all_blocks data 41684 1727204483.73924: processing included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml 41684 1727204483.73924: loading included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml 41684 1727204483.73926: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml 41684 1727204483.73996: in VariableManager get_vars() 41684 1727204483.74012: done with get_vars() 41684 1727204483.74092: done processing included file 41684 1727204483.74093: iterating over new_blocks loaded from include file 41684 1727204483.74094: in VariableManager get_vars() 41684 1727204483.74105: done with get_vars() 41684 1727204483.74106: filtering new block on tags 41684 1727204483.74129: done filtering new block on tags 41684 1727204483.74130: done iterating over new_blocks loaded from include file included: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml for managed-node1 41684 1727204483.74134: extending task lists for all hosts with included blocks 41684 1727204483.74994: done extending task lists 41684 1727204483.74995: done processing included files 41684 1727204483.74996: results queue empty 41684 1727204483.74996: checking for any_errors_fatal 41684 1727204483.75000: done checking for any_errors_fatal 41684 1727204483.75000: checking for max_fail_percentage 41684 1727204483.75001: done checking for max_fail_percentage 41684 1727204483.75002: checking to see if all hosts have failed 
and the running result is not ok 41684 1727204483.75003: done checking to see if all hosts have failed 41684 1727204483.75003: getting the remaining hosts for this loop 41684 1727204483.75004: done getting the remaining hosts for this loop 41684 1727204483.75006: getting the next task for host managed-node1 41684 1727204483.75009: done getting next task for host managed-node1 41684 1727204483.75010: ^ task is: TASK: Include the task 'get_interface_stat.yml' 41684 1727204483.75012: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 41684 1727204483.75014: getting variables 41684 1727204483.75015: in VariableManager get_vars() 41684 1727204483.75024: Calling all_inventory to load vars for managed-node1 41684 1727204483.75025: Calling groups_inventory to load vars for managed-node1 41684 1727204483.75027: Calling all_plugins_inventory to load vars for managed-node1 41684 1727204483.75030: Calling all_plugins_play to load vars for managed-node1 41684 1727204483.75032: Calling groups_plugins_inventory to load vars for managed-node1 41684 1727204483.75033: Calling groups_plugins_play to load vars for managed-node1 41684 1727204483.75763: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41684 1727204483.76665: done with get_vars() 41684 1727204483.76681: done getting variables TASK [Include the task 'get_interface_stat.yml'] ******************************* task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml:3 Tuesday 24 September 2024 15:01:23 -0400 (0:00:00.057) 0:00:40.168 ***** 41684 1727204483.76736: entering _queue_task() for managed-node1/include_tasks 41684 1727204483.76976: worker is 1 (out of 1 available) 41684 1727204483.76989: exiting _queue_task() for managed-node1/include_tasks 41684 1727204483.77004: done queuing things up, now waiting for results queue to drain 41684 1727204483.77005: waiting for pending results... 
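Each task banner in this run carries two timers, e.g. `(0:00:00.057) 0:00:40.168` above: the first is the task's own duration, the second a running total for the play. The totals are consistent across consecutive banners (the previous one read `(0:00:00.364) 0:00:40.111`), which a trivial sketch confirms (values copied from the two banners; the rounding step is an assumption about the display precision):

```python
# Running total at the previous banner ("Assert interface0 is absent")
prev_total = 40.111
# Duration printed on the current banner (the include task)
task_time = 0.057

# New running total = previous total + this task's duration
total = round(prev_total + task_time, 3)
print(total)  # 40.168, matching the banner above
```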
41684 1727204483.77196: running TaskExecutor() for managed-node1/TASK: Include the task 'get_interface_stat.yml' 41684 1727204483.77276: in run() - task 0affcd87-79f5-3839-086d-000000000990 41684 1727204483.77287: variable 'ansible_search_path' from source: unknown 41684 1727204483.77291: variable 'ansible_search_path' from source: unknown 41684 1727204483.77320: calling self._execute() 41684 1727204483.77393: variable 'ansible_host' from source: host vars for 'managed-node1' 41684 1727204483.77397: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41684 1727204483.77406: variable 'omit' from source: magic vars 41684 1727204483.77680: variable 'ansible_distribution_major_version' from source: facts 41684 1727204483.77691: Evaluated conditional (ansible_distribution_major_version != '6'): True 41684 1727204483.77697: _execute() done 41684 1727204483.77701: dumping result to json 41684 1727204483.77705: done dumping result, returning 41684 1727204483.77712: done running TaskExecutor() for managed-node1/TASK: Include the task 'get_interface_stat.yml' [0affcd87-79f5-3839-086d-000000000990] 41684 1727204483.77717: sending task result for task 0affcd87-79f5-3839-086d-000000000990 41684 1727204483.77800: done sending task result for task 0affcd87-79f5-3839-086d-000000000990 41684 1727204483.77803: WORKER PROCESS EXITING 41684 1727204483.77837: no more pending results, returning what we have 41684 1727204483.77842: in VariableManager get_vars() 41684 1727204483.77889: Calling all_inventory to load vars for managed-node1 41684 1727204483.77892: Calling groups_inventory to load vars for managed-node1 41684 1727204483.77894: Calling all_plugins_inventory to load vars for managed-node1 41684 1727204483.77906: Calling all_plugins_play to load vars for managed-node1 41684 1727204483.77908: Calling groups_plugins_inventory to load vars for managed-node1 41684 1727204483.77911: Calling groups_plugins_play to load vars for managed-node1 41684 
1727204483.78718: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41684 1727204483.79727: done with get_vars() 41684 1727204483.79740: variable 'ansible_search_path' from source: unknown 41684 1727204483.79741: variable 'ansible_search_path' from source: unknown 41684 1727204483.79771: we have included files to process 41684 1727204483.79772: generating all_blocks data 41684 1727204483.79773: done generating all_blocks data 41684 1727204483.79773: processing included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 41684 1727204483.79774: loading included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 41684 1727204483.79776: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 41684 1727204483.79900: done processing included file 41684 1727204483.79902: iterating over new_blocks loaded from include file 41684 1727204483.79903: in VariableManager get_vars() 41684 1727204483.79915: done with get_vars() 41684 1727204483.79916: filtering new block on tags 41684 1727204483.79931: done filtering new block on tags 41684 1727204483.79932: done iterating over new_blocks loaded from include file included: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml for managed-node1 41684 1727204483.79936: extending task lists for all hosts with included blocks 41684 1727204483.80010: done extending task lists 41684 1727204483.80011: done processing included files 41684 1727204483.80012: results queue empty 41684 1727204483.80012: checking for any_errors_fatal 41684 1727204483.80015: done checking for any_errors_fatal 41684 1727204483.80015: checking for max_fail_percentage 41684 1727204483.80016: done checking for 
max_fail_percentage 41684 1727204483.80016: checking to see if all hosts have failed and the running result is not ok 41684 1727204483.80017: done checking to see if all hosts have failed 41684 1727204483.80017: getting the remaining hosts for this loop 41684 1727204483.80018: done getting the remaining hosts for this loop 41684 1727204483.80020: getting the next task for host managed-node1 41684 1727204483.80023: done getting next task for host managed-node1 41684 1727204483.80025: ^ task is: TASK: Get stat for interface {{ interface }} 41684 1727204483.80027: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 41684 1727204483.80028: getting variables 41684 1727204483.80029: in VariableManager get_vars() 41684 1727204483.80037: Calling all_inventory to load vars for managed-node1 41684 1727204483.80039: Calling groups_inventory to load vars for managed-node1 41684 1727204483.80041: Calling all_plugins_inventory to load vars for managed-node1 41684 1727204483.80045: Calling all_plugins_play to load vars for managed-node1 41684 1727204483.80046: Calling groups_plugins_inventory to load vars for managed-node1 41684 1727204483.80048: Calling groups_plugins_play to load vars for managed-node1 41684 1727204483.80712: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41684 1727204483.81614: done with get_vars() 41684 1727204483.81630: done getting variables 41684 1727204483.81747: variable 'interface' from source: set_fact TASK [Get stat for interface ethtest0] ***************************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml:3 Tuesday 24 September 2024 15:01:23 -0400 (0:00:00.050) 0:00:40.219 ***** 41684 1727204483.81772: entering _queue_task() for managed-node1/stat 41684 1727204483.82008: worker is 1 (out of 1 available) 41684 1727204483.82020: exiting _queue_task() for managed-node1/stat 41684 1727204483.82034: done queuing things up, now waiting for results queue to drain 41684 1727204483.82035: waiting for pending results... 
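Note how the queued task name `Get stat for interface {{ interface }}` becomes `Get stat for interface ethtest0` once the `interface` variable (from `set_fact`) is resolved. The substitution can be mimicked with a minimal sketch; this is a toy stand-in for Jinja2 templating, not what Ansible actually runs:

```python
import re

def render(name, variables):
    # Replace each {{ var }} placeholder with its value, loosely
    # imitating how task names are templated for display.
    return re.sub(
        r"\{\{\s*(\w+)\s*\}\}",
        lambda m: str(variables[m.group(1)]),
        name,
    )

print(render("Get stat for interface {{ interface }}",
             {"interface": "ethtest0"}))
# Get stat for interface ethtest0
```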
41684 1727204483.82211: running TaskExecutor() for managed-node1/TASK: Get stat for interface ethtest0 41684 1727204483.82291: in run() - task 0affcd87-79f5-3839-086d-000000000a4d 41684 1727204483.82303: variable 'ansible_search_path' from source: unknown 41684 1727204483.82307: variable 'ansible_search_path' from source: unknown 41684 1727204483.82334: calling self._execute() 41684 1727204483.82409: variable 'ansible_host' from source: host vars for 'managed-node1' 41684 1727204483.82413: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41684 1727204483.82423: variable 'omit' from source: magic vars 41684 1727204483.82696: variable 'ansible_distribution_major_version' from source: facts 41684 1727204483.82708: Evaluated conditional (ansible_distribution_major_version != '6'): True 41684 1727204483.82712: variable 'omit' from source: magic vars 41684 1727204483.82749: variable 'omit' from source: magic vars 41684 1727204483.82819: variable 'interface' from source: set_fact 41684 1727204483.82832: variable 'omit' from source: magic vars 41684 1727204483.82869: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 41684 1727204483.82896: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 41684 1727204483.82913: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 41684 1727204483.82929: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41684 1727204483.82938: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41684 1727204483.82961: variable 'inventory_hostname' from source: host vars for 'managed-node1' 41684 1727204483.82966: variable 'ansible_host' from source: host vars for 'managed-node1' 41684 1727204483.82971: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41684 1727204483.83040: Set connection var ansible_connection to ssh 41684 1727204483.83046: Set connection var ansible_pipelining to False 41684 1727204483.83051: Set connection var ansible_module_compression to ZIP_DEFLATED 41684 1727204483.83056: Set connection var ansible_timeout to 10 41684 1727204483.83063: Set connection var ansible_shell_executable to /bin/sh 41684 1727204483.83069: Set connection var ansible_shell_type to sh 41684 1727204483.83088: variable 'ansible_shell_executable' from source: unknown 41684 1727204483.83091: variable 'ansible_connection' from source: unknown 41684 1727204483.83094: variable 'ansible_module_compression' from source: unknown 41684 1727204483.83097: variable 'ansible_shell_type' from source: unknown 41684 1727204483.83099: variable 'ansible_shell_executable' from source: unknown 41684 1727204483.83101: variable 'ansible_host' from source: host vars for 'managed-node1' 41684 1727204483.83103: variable 'ansible_pipelining' from source: unknown 41684 1727204483.83105: variable 'ansible_timeout' from source: unknown 41684 1727204483.83110: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41684 1727204483.83256: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 41684 1727204483.83266: variable 'omit' from source: magic vars 41684 1727204483.83274: starting attempt loop 41684 1727204483.83276: running the handler 41684 1727204483.83288: _low_level_execute_command(): starting 41684 1727204483.83295: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 41684 1727204483.83822: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: 
Reading configuration data /etc/ssh/ssh_config <<< 41684 1727204483.83832: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41684 1727204483.83857: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found <<< 41684 1727204483.83874: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found <<< 41684 1727204483.83886: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204483.83935: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 41684 1727204483.83942: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41684 1727204483.83956: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41684 1727204483.84026: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41684 1727204483.85596: stdout chunk (state=3): >>>/root <<< 41684 1727204483.85738: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41684 1727204483.85759: stderr chunk (state=3): >>><<< 41684 1727204483.85762: stdout chunk (state=3): >>><<< 41684 1727204483.85787: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41684 1727204483.85798: _low_level_execute_command(): starting 41684 1727204483.85803: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204483.857864-44584-238845702902697 `" && echo ansible-tmp-1727204483.857864-44584-238845702902697="` echo /root/.ansible/tmp/ansible-tmp-1727204483.857864-44584-238845702902697 `" ) && sleep 0' 41684 1727204483.86249: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 41684 1727204483.86255: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41684 1727204483.86294: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found <<< 41684 1727204483.86316: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204483.86356: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 41684 1727204483.86370: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 41684 1727204483.86437: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41684 1727204483.88278: stdout chunk (state=3): >>>ansible-tmp-1727204483.857864-44584-238845702902697=/root/.ansible/tmp/ansible-tmp-1727204483.857864-44584-238845702902697 <<< 41684 1727204483.88389: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41684 1727204483.88446: stderr chunk (state=3): >>><<< 41684 1727204483.88450: stdout chunk (state=3): >>><<< 41684 1727204483.88469: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204483.857864-44584-238845702902697=/root/.ansible/tmp/ansible-tmp-1727204483.857864-44584-238845702902697 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is 
address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41684 1727204483.88507: variable 'ansible_module_compression' from source: unknown 41684 1727204483.88560: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-41684fyviudxd/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 41684 1727204483.88591: variable 'ansible_facts' from source: unknown 41684 1727204483.88656: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204483.857864-44584-238845702902697/AnsiballZ_stat.py 41684 1727204483.88770: Sending initial data 41684 1727204483.88774: Sent initial data (152 bytes) 41684 1727204483.89468: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 41684 1727204483.89472: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41684 1727204483.89512: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204483.89516: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: 
Reading configuration data /etc/ssh/ssh_config <<< 41684 1727204483.89518: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204483.89570: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 41684 1727204483.89574: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41684 1727204483.89584: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41684 1727204483.89636: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41684 1727204483.91321: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 41684 1727204483.91375: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 41684 1727204483.91430: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-41684fyviudxd/tmpttk4lgyn /root/.ansible/tmp/ansible-tmp-1727204483.857864-44584-238845702902697/AnsiballZ_stat.py <<< 41684 1727204483.91484: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 41684 1727204483.92317: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41684 
1727204483.92426: stderr chunk (state=3): >>><<< 41684 1727204483.92431: stdout chunk (state=3): >>><<< 41684 1727204483.92450: done transferring module to remote 41684 1727204483.92458: _low_level_execute_command(): starting 41684 1727204483.92463: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204483.857864-44584-238845702902697/ /root/.ansible/tmp/ansible-tmp-1727204483.857864-44584-238845702902697/AnsiballZ_stat.py && sleep 0' 41684 1727204483.92924: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 41684 1727204483.92929: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41684 1727204483.92961: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204483.92976: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 41684 1727204483.92987: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204483.93034: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 41684 1727204483.93046: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 41684 1727204483.93107: stderr chunk (state=3): >>>debug1: mux_client_request_session: master 
session id: 2 <<< 41684 1727204483.94820: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41684 1727204483.94872: stderr chunk (state=3): >>><<< 41684 1727204483.94875: stdout chunk (state=3): >>><<< 41684 1727204483.94891: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41684 1727204483.94893: _low_level_execute_command(): starting 41684 1727204483.94899: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204483.857864-44584-238845702902697/AnsiballZ_stat.py && sleep 0' 41684 1727204483.95349: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 41684 1727204483.95362: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf <<< 41684 1727204483.95388: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204483.95400: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204483.95445: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 41684 1727204483.95457: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 41684 1727204483.95556: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41684 1727204484.08519: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/ethtest0", "follow": false, "checksum_algorithm": "sha1"}}} <<< 41684 1727204484.09499: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.148 closed. 
<<< 41684 1727204484.09503: stdout chunk (state=3): >>><<< 41684 1727204484.09505: stderr chunk (state=3): >>><<< 41684 1727204484.09636: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/ethtest0", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.148 closed. 
41684 1727204484.09642: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/sys/class/net/ethtest0', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204483.857864-44584-238845702902697/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 41684 1727204484.09646: _low_level_execute_command(): starting 41684 1727204484.09648: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204483.857864-44584-238845702902697/ > /dev/null 2>&1 && sleep 0' 41684 1727204484.10267: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 41684 1727204484.10289: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41684 1727204484.10311: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41684 1727204484.10331: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41684 1727204484.10378: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 41684 1727204484.10390: stderr chunk (state=3): >>>debug2: match not found <<< 41684 1727204484.10409: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204484.10432: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 41684 1727204484.10447: 
stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 41684 1727204484.10459: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 41684 1727204484.10474: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41684 1727204484.10489: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41684 1727204484.10505: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41684 1727204484.10515: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 41684 1727204484.10528: stderr chunk (state=3): >>>debug2: match found <<< 41684 1727204484.10539: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204484.10606: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 41684 1727204484.10638: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41684 1727204484.10653: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41684 1727204484.10755: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41684 1727204484.12576: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41684 1727204484.12580: stdout chunk (state=3): >>><<< 41684 1727204484.12582: stderr chunk (state=3): >>><<< 41684 1727204484.12761: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests 
final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41684 1727204484.12768: handler run complete 41684 1727204484.12773: attempt loop complete, returning result 41684 1727204484.12797: _execute() done 41684 1727204484.12800: dumping result to json 41684 1727204484.12802: done dumping result, returning 41684 1727204484.12804: done running TaskExecutor() for managed-node1/TASK: Get stat for interface ethtest0 [0affcd87-79f5-3839-086d-000000000a4d] 41684 1727204484.12806: sending task result for task 0affcd87-79f5-3839-086d-000000000a4d 41684 1727204484.12878: done sending task result for task 0affcd87-79f5-3839-086d-000000000a4d 41684 1727204484.12881: WORKER PROCESS EXITING ok: [managed-node1] => { "changed": false, "stat": { "exists": false } } 41684 1727204484.12954: no more pending results, returning what we have 41684 1727204484.12958: results queue empty 41684 1727204484.12959: checking for any_errors_fatal 41684 1727204484.12961: done checking for any_errors_fatal 41684 1727204484.12961: checking for max_fail_percentage 41684 1727204484.12963: done checking for max_fail_percentage 41684 1727204484.12965: checking to see if all hosts have failed and the running result is not ok 41684 1727204484.12966: done checking to see if all hosts have failed 41684 1727204484.12967: getting the remaining hosts for this loop 41684 
1727204484.12968: done getting the remaining hosts for this loop 41684 1727204484.12972: getting the next task for host managed-node1 41684 1727204484.12979: done getting next task for host managed-node1 41684 1727204484.12981: ^ task is: TASK: Assert that the interface is absent - '{{ interface }}' 41684 1727204484.12985: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 41684 1727204484.12990: getting variables 41684 1727204484.12992: in VariableManager get_vars() 41684 1727204484.13029: Calling all_inventory to load vars for managed-node1 41684 1727204484.13031: Calling groups_inventory to load vars for managed-node1 41684 1727204484.13034: Calling all_plugins_inventory to load vars for managed-node1 41684 1727204484.13043: Calling all_plugins_play to load vars for managed-node1 41684 1727204484.13045: Calling groups_plugins_inventory to load vars for managed-node1 41684 1727204484.13047: Calling groups_plugins_play to load vars for managed-node1 41684 1727204484.13950: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41684 1727204484.15245: done with get_vars() 41684 1727204484.15272: done getting variables 41684 1727204484.15336: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 41684 1727204484.15459: variable 'interface' from source: set_fact TASK [Assert that the interface is absent - 'ethtest0'] ************************ task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml:5 Tuesday 24 September 2024 15:01:24 -0400 (0:00:00.337) 0:00:40.556 ***** 41684 1727204484.15493: entering _queue_task() for managed-node1/assert 41684 1727204484.15832: worker is 1 (out of 1 available) 41684 1727204484.15846: exiting _queue_task() for managed-node1/assert 41684 1727204484.15860: done queuing things up, now waiting for results queue to drain 41684 1727204484.15865: waiting for pending results... 
41684 1727204484.16065: running TaskExecutor() for managed-node1/TASK: Assert that the interface is absent - 'ethtest0' 41684 1727204484.16132: in run() - task 0affcd87-79f5-3839-086d-000000000991 41684 1727204484.16149: variable 'ansible_search_path' from source: unknown 41684 1727204484.16153: variable 'ansible_search_path' from source: unknown 41684 1727204484.16181: calling self._execute() 41684 1727204484.16259: variable 'ansible_host' from source: host vars for 'managed-node1' 41684 1727204484.16267: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41684 1727204484.16271: variable 'omit' from source: magic vars 41684 1727204484.16540: variable 'ansible_distribution_major_version' from source: facts 41684 1727204484.16554: Evaluated conditional (ansible_distribution_major_version != '6'): True 41684 1727204484.16561: variable 'omit' from source: magic vars 41684 1727204484.16598: variable 'omit' from source: magic vars 41684 1727204484.16667: variable 'interface' from source: set_fact 41684 1727204484.16680: variable 'omit' from source: magic vars 41684 1727204484.16716: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 41684 1727204484.16744: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 41684 1727204484.16767: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 41684 1727204484.16779: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41684 1727204484.16789: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41684 1727204484.16815: variable 'inventory_hostname' from source: host vars for 'managed-node1' 41684 1727204484.16819: variable 'ansible_host' from source: host vars for 'managed-node1' 41684 1727204484.16822: 
variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41684 1727204484.16892: Set connection var ansible_connection to ssh 41684 1727204484.16897: Set connection var ansible_pipelining to False 41684 1727204484.16903: Set connection var ansible_module_compression to ZIP_DEFLATED 41684 1727204484.16911: Set connection var ansible_timeout to 10 41684 1727204484.16921: Set connection var ansible_shell_executable to /bin/sh 41684 1727204484.16924: Set connection var ansible_shell_type to sh 41684 1727204484.16942: variable 'ansible_shell_executable' from source: unknown 41684 1727204484.16945: variable 'ansible_connection' from source: unknown 41684 1727204484.16947: variable 'ansible_module_compression' from source: unknown 41684 1727204484.16950: variable 'ansible_shell_type' from source: unknown 41684 1727204484.16952: variable 'ansible_shell_executable' from source: unknown 41684 1727204484.16958: variable 'ansible_host' from source: host vars for 'managed-node1' 41684 1727204484.16962: variable 'ansible_pipelining' from source: unknown 41684 1727204484.16971: variable 'ansible_timeout' from source: unknown 41684 1727204484.16974: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41684 1727204484.17076: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 41684 1727204484.17087: variable 'omit' from source: magic vars 41684 1727204484.17093: starting attempt loop 41684 1727204484.17095: running the handler 41684 1727204484.17197: variable 'interface_stat' from source: set_fact 41684 1727204484.17205: Evaluated conditional (not interface_stat.stat.exists): True 41684 1727204484.17211: handler run complete 41684 1727204484.17222: attempt loop complete, returning result 
41684 1727204484.17226: _execute() done 41684 1727204484.17229: dumping result to json 41684 1727204484.17231: done dumping result, returning 41684 1727204484.17237: done running TaskExecutor() for managed-node1/TASK: Assert that the interface is absent - 'ethtest0' [0affcd87-79f5-3839-086d-000000000991] 41684 1727204484.17248: sending task result for task 0affcd87-79f5-3839-086d-000000000991 41684 1727204484.17332: done sending task result for task 0affcd87-79f5-3839-086d-000000000991 41684 1727204484.17335: WORKER PROCESS EXITING ok: [managed-node1] => { "changed": false } MSG: All assertions passed 41684 1727204484.17392: no more pending results, returning what we have 41684 1727204484.17395: results queue empty 41684 1727204484.17396: checking for any_errors_fatal 41684 1727204484.17407: done checking for any_errors_fatal 41684 1727204484.17407: checking for max_fail_percentage 41684 1727204484.17409: done checking for max_fail_percentage 41684 1727204484.17410: checking to see if all hosts have failed and the running result is not ok 41684 1727204484.17410: done checking to see if all hosts have failed 41684 1727204484.17411: getting the remaining hosts for this loop 41684 1727204484.17413: done getting the remaining hosts for this loop 41684 1727204484.17417: getting the next task for host managed-node1 41684 1727204484.17425: done getting next task for host managed-node1 41684 1727204484.17429: ^ task is: TASK: Assert interface0 profile and interface1 profile are absent 41684 1727204484.17431: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 41684 1727204484.17436: getting variables 41684 1727204484.17438: in VariableManager get_vars() 41684 1727204484.17490: Calling all_inventory to load vars for managed-node1 41684 1727204484.17493: Calling groups_inventory to load vars for managed-node1 41684 1727204484.17495: Calling all_plugins_inventory to load vars for managed-node1 41684 1727204484.17505: Calling all_plugins_play to load vars for managed-node1 41684 1727204484.17507: Calling groups_plugins_inventory to load vars for managed-node1 41684 1727204484.17509: Calling groups_plugins_play to load vars for managed-node1 41684 1727204484.18831: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41684 1727204484.20164: done with get_vars() 41684 1727204484.20184: done getting variables TASK [Assert interface0 profile and interface1 profile are absent] ************* task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_route_device.yml:162 Tuesday 24 September 2024 15:01:24 -0400 (0:00:00.047) 0:00:40.604 ***** 41684 1727204484.20253: entering _queue_task() for managed-node1/include_tasks 41684 1727204484.20497: worker is 1 (out of 1 available) 41684 1727204484.20510: exiting _queue_task() for managed-node1/include_tasks 41684 1727204484.20524: done queuing things up, now waiting for results queue to drain 41684 1727204484.20525: waiting for pending results... 
41684 1727204484.20766: running TaskExecutor() for managed-node1/TASK: Assert interface0 profile and interface1 profile are absent 41684 1727204484.20843: in run() - task 0affcd87-79f5-3839-086d-0000000000ba 41684 1727204484.20854: variable 'ansible_search_path' from source: unknown 41684 1727204484.20895: variable 'interface0' from source: play vars 41684 1727204484.21051: variable 'interface0' from source: play vars 41684 1727204484.21067: variable 'interface1' from source: play vars 41684 1727204484.21114: variable 'interface1' from source: play vars 41684 1727204484.21123: variable 'omit' from source: magic vars 41684 1727204484.21243: variable 'ansible_host' from source: host vars for 'managed-node1' 41684 1727204484.21249: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41684 1727204484.21266: variable 'omit' from source: magic vars 41684 1727204484.21478: variable 'ansible_distribution_major_version' from source: facts 41684 1727204484.21498: Evaluated conditional (ansible_distribution_major_version != '6'): True 41684 1727204484.21541: variable 'item' from source: unknown 41684 1727204484.21614: variable 'item' from source: unknown 41684 1727204484.21834: variable 'ansible_host' from source: host vars for 'managed-node1' 41684 1727204484.21853: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41684 1727204484.21869: variable 'omit' from source: magic vars 41684 1727204484.22148: variable 'ansible_distribution_major_version' from source: facts 41684 1727204484.22159: Evaluated conditional (ansible_distribution_major_version != '6'): True 41684 1727204484.22194: variable 'item' from source: unknown 41684 1727204484.22263: variable 'item' from source: unknown 41684 1727204484.22355: dumping result to json 41684 1727204484.22365: done dumping result, returning 41684 1727204484.22377: done running TaskExecutor() for managed-node1/TASK: Assert interface0 profile and interface1 profile are absent 
[0affcd87-79f5-3839-086d-0000000000ba] 41684 1727204484.22388: sending task result for task 0affcd87-79f5-3839-086d-0000000000ba 41684 1727204484.22485: no more pending results, returning what we have 41684 1727204484.22491: in VariableManager get_vars() 41684 1727204484.22543: Calling all_inventory to load vars for managed-node1 41684 1727204484.22546: Calling groups_inventory to load vars for managed-node1 41684 1727204484.22548: Calling all_plugins_inventory to load vars for managed-node1 41684 1727204484.22564: Calling all_plugins_play to load vars for managed-node1 41684 1727204484.22568: Calling groups_plugins_inventory to load vars for managed-node1 41684 1727204484.22573: Calling groups_plugins_play to load vars for managed-node1 41684 1727204484.23783: done sending task result for task 0affcd87-79f5-3839-086d-0000000000ba 41684 1727204484.23786: WORKER PROCESS EXITING 41684 1727204484.24846: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41684 1727204484.25931: done with get_vars() 41684 1727204484.25948: variable 'ansible_search_path' from source: unknown 41684 1727204484.25961: variable 'ansible_search_path' from source: unknown 41684 1727204484.25969: we have included files to process 41684 1727204484.25969: generating all_blocks data 41684 1727204484.25971: done generating all_blocks data 41684 1727204484.25974: processing included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_absent.yml 41684 1727204484.25975: loading included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_absent.yml 41684 1727204484.25977: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_absent.yml 41684 1727204484.26092: in VariableManager get_vars() 41684 1727204484.26111: done with get_vars() 41684 
1727204484.26190: done processing included file 41684 1727204484.26192: iterating over new_blocks loaded from include file 41684 1727204484.26194: in VariableManager get_vars() 41684 1727204484.26208: done with get_vars() 41684 1727204484.26209: filtering new block on tags 41684 1727204484.26230: done filtering new block on tags 41684 1727204484.26231: done iterating over new_blocks loaded from include file included: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_absent.yml for managed-node1 => (item=ethtest0) 41684 1727204484.26235: processing included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_absent.yml 41684 1727204484.26236: loading included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_absent.yml 41684 1727204484.26238: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_absent.yml 41684 1727204484.26289: in VariableManager get_vars() 41684 1727204484.26305: done with get_vars() 41684 1727204484.26365: done processing included file 41684 1727204484.26366: iterating over new_blocks loaded from include file 41684 1727204484.26367: in VariableManager get_vars() 41684 1727204484.26379: done with get_vars() 41684 1727204484.26380: filtering new block on tags 41684 1727204484.26398: done filtering new block on tags 41684 1727204484.26400: done iterating over new_blocks loaded from include file included: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_absent.yml for managed-node1 => (item=ethtest1) 41684 1727204484.26402: extending task lists for all hosts with included blocks 41684 1727204484.28980: done extending task lists 41684 1727204484.28982: done processing included files 41684 1727204484.28983: results queue 
empty 41684 1727204484.28984: checking for any_errors_fatal 41684 1727204484.28987: done checking for any_errors_fatal 41684 1727204484.28988: checking for max_fail_percentage 41684 1727204484.28989: done checking for max_fail_percentage 41684 1727204484.28990: checking to see if all hosts have failed and the running result is not ok 41684 1727204484.28990: done checking to see if all hosts have failed 41684 1727204484.28995: getting the remaining hosts for this loop 41684 1727204484.28997: done getting the remaining hosts for this loop 41684 1727204484.29001: getting the next task for host managed-node1 41684 1727204484.29006: done getting next task for host managed-node1 41684 1727204484.29009: ^ task is: TASK: Include the task 'get_profile_stat.yml' 41684 1727204484.29012: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 41684 1727204484.29015: getting variables 41684 1727204484.29016: in VariableManager get_vars() 41684 1727204484.29035: Calling all_inventory to load vars for managed-node1 41684 1727204484.29038: Calling groups_inventory to load vars for managed-node1 41684 1727204484.29046: Calling all_plugins_inventory to load vars for managed-node1 41684 1727204484.29052: Calling all_plugins_play to load vars for managed-node1 41684 1727204484.29055: Calling groups_plugins_inventory to load vars for managed-node1 41684 1727204484.29057: Calling groups_plugins_play to load vars for managed-node1 41684 1727204484.30421: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41684 1727204484.37013: done with get_vars() 41684 1727204484.37048: done getting variables TASK [Include the task 'get_profile_stat.yml'] ********************************* task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_absent.yml:3 Tuesday 24 September 2024 15:01:24 -0400 (0:00:00.168) 0:00:40.772 ***** 41684 1727204484.37130: entering _queue_task() for managed-node1/include_tasks 41684 1727204484.37517: worker is 1 (out of 1 available) 41684 1727204484.37531: exiting _queue_task() for managed-node1/include_tasks 41684 1727204484.37544: done queuing things up, now waiting for results queue to drain 41684 1727204484.37545: waiting for pending results... 
41684 1727204484.37843: running TaskExecutor() for managed-node1/TASK: Include the task 'get_profile_stat.yml' 41684 1727204484.37984: in run() - task 0affcd87-79f5-3839-086d-000000000a6c 41684 1727204484.38010: variable 'ansible_search_path' from source: unknown 41684 1727204484.38023: variable 'ansible_search_path' from source: unknown 41684 1727204484.38068: calling self._execute() 41684 1727204484.38177: variable 'ansible_host' from source: host vars for 'managed-node1' 41684 1727204484.38191: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41684 1727204484.38210: variable 'omit' from source: magic vars 41684 1727204484.38603: variable 'ansible_distribution_major_version' from source: facts 41684 1727204484.38613: Evaluated conditional (ansible_distribution_major_version != '6'): True 41684 1727204484.38621: _execute() done 41684 1727204484.38624: dumping result to json 41684 1727204484.38626: done dumping result, returning 41684 1727204484.38632: done running TaskExecutor() for managed-node1/TASK: Include the task 'get_profile_stat.yml' [0affcd87-79f5-3839-086d-000000000a6c] 41684 1727204484.38638: sending task result for task 0affcd87-79f5-3839-086d-000000000a6c 41684 1727204484.38742: done sending task result for task 0affcd87-79f5-3839-086d-000000000a6c 41684 1727204484.38745: WORKER PROCESS EXITING 41684 1727204484.38790: no more pending results, returning what we have 41684 1727204484.38795: in VariableManager get_vars() 41684 1727204484.38845: Calling all_inventory to load vars for managed-node1 41684 1727204484.38848: Calling groups_inventory to load vars for managed-node1 41684 1727204484.38850: Calling all_plugins_inventory to load vars for managed-node1 41684 1727204484.38864: Calling all_plugins_play to load vars for managed-node1 41684 1727204484.38875: Calling groups_plugins_inventory to load vars for managed-node1 41684 1727204484.38879: Calling groups_plugins_play to load vars for managed-node1 41684 
1727204484.39705: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41684 1727204484.41189: done with get_vars() 41684 1727204484.41216: variable 'ansible_search_path' from source: unknown 41684 1727204484.41217: variable 'ansible_search_path' from source: unknown 41684 1727204484.41268: we have included files to process 41684 1727204484.41270: generating all_blocks data 41684 1727204484.41271: done generating all_blocks data 41684 1727204484.41273: processing included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 41684 1727204484.41274: loading included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 41684 1727204484.41278: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 41684 1727204484.42084: done processing included file 41684 1727204484.42086: iterating over new_blocks loaded from include file 41684 1727204484.42087: in VariableManager get_vars() 41684 1727204484.42103: done with get_vars() 41684 1727204484.42104: filtering new block on tags 41684 1727204484.42201: done filtering new block on tags 41684 1727204484.42203: in VariableManager get_vars() 41684 1727204484.42217: done with get_vars() 41684 1727204484.42218: filtering new block on tags 41684 1727204484.42253: done filtering new block on tags 41684 1727204484.42254: done iterating over new_blocks loaded from include file included: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml for managed-node1 41684 1727204484.42259: extending task lists for all hosts with included blocks 41684 1727204484.42338: done extending task lists 41684 1727204484.42339: done processing included files 41684 1727204484.42340: results queue empty 41684 
1727204484.42341: checking for any_errors_fatal 41684 1727204484.42344: done checking for any_errors_fatal 41684 1727204484.42344: checking for max_fail_percentage 41684 1727204484.42345: done checking for max_fail_percentage 41684 1727204484.42345: checking to see if all hosts have failed and the running result is not ok 41684 1727204484.42346: done checking to see if all hosts have failed 41684 1727204484.42347: getting the remaining hosts for this loop 41684 1727204484.42348: done getting the remaining hosts for this loop 41684 1727204484.42349: getting the next task for host managed-node1 41684 1727204484.42352: done getting next task for host managed-node1 41684 1727204484.42354: ^ task is: TASK: Initialize NM profile exist and ansible_managed comment flag 41684 1727204484.42356: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 41684 1727204484.42358: getting variables 41684 1727204484.42358: in VariableManager get_vars() 41684 1727204484.42371: Calling all_inventory to load vars for managed-node1 41684 1727204484.42373: Calling groups_inventory to load vars for managed-node1 41684 1727204484.42374: Calling all_plugins_inventory to load vars for managed-node1 41684 1727204484.42378: Calling all_plugins_play to load vars for managed-node1 41684 1727204484.42379: Calling groups_plugins_inventory to load vars for managed-node1 41684 1727204484.42381: Calling groups_plugins_play to load vars for managed-node1 41684 1727204484.43129: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41684 1727204484.45030: done with get_vars() 41684 1727204484.45065: done getting variables 41684 1727204484.45114: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Initialize NM profile exist and ansible_managed comment flag] ************ task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:3 Tuesday 24 September 2024 15:01:24 -0400 (0:00:00.080) 0:00:40.853 ***** 41684 1727204484.45153: entering _queue_task() for managed-node1/set_fact 41684 1727204484.45980: worker is 1 (out of 1 available) 41684 1727204484.45993: exiting _queue_task() for managed-node1/set_fact 41684 1727204484.46006: done queuing things up, now waiting for results queue to drain 41684 1727204484.46008: waiting for pending results... 
41684 1727204484.46318: running TaskExecutor() for managed-node1/TASK: Initialize NM profile exist and ansible_managed comment flag 41684 1727204484.46409: in run() - task 0affcd87-79f5-3839-086d-000000000b3c 41684 1727204484.46418: variable 'ansible_search_path' from source: unknown 41684 1727204484.46422: variable 'ansible_search_path' from source: unknown 41684 1727204484.46457: calling self._execute() 41684 1727204484.46539: variable 'ansible_host' from source: host vars for 'managed-node1' 41684 1727204484.46543: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41684 1727204484.46552: variable 'omit' from source: magic vars 41684 1727204484.46846: variable 'ansible_distribution_major_version' from source: facts 41684 1727204484.46857: Evaluated conditional (ansible_distribution_major_version != '6'): True 41684 1727204484.46867: variable 'omit' from source: magic vars 41684 1727204484.46906: variable 'omit' from source: magic vars 41684 1727204484.46929: variable 'omit' from source: magic vars 41684 1727204484.46966: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 41684 1727204484.46994: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 41684 1727204484.47014: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 41684 1727204484.47029: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41684 1727204484.47038: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41684 1727204484.47066: variable 'inventory_hostname' from source: host vars for 'managed-node1' 41684 1727204484.47070: variable 'ansible_host' from source: host vars for 'managed-node1' 41684 1727204484.47073: variable 'ansible_ssh_extra_args' from source: host vars for 
'managed-node1' 41684 1727204484.47141: Set connection var ansible_connection to ssh 41684 1727204484.47145: Set connection var ansible_pipelining to False 41684 1727204484.47151: Set connection var ansible_module_compression to ZIP_DEFLATED 41684 1727204484.47156: Set connection var ansible_timeout to 10 41684 1727204484.47167: Set connection var ansible_shell_executable to /bin/sh 41684 1727204484.47170: Set connection var ansible_shell_type to sh 41684 1727204484.47188: variable 'ansible_shell_executable' from source: unknown 41684 1727204484.47192: variable 'ansible_connection' from source: unknown 41684 1727204484.47195: variable 'ansible_module_compression' from source: unknown 41684 1727204484.47198: variable 'ansible_shell_type' from source: unknown 41684 1727204484.47202: variable 'ansible_shell_executable' from source: unknown 41684 1727204484.47205: variable 'ansible_host' from source: host vars for 'managed-node1' 41684 1727204484.47207: variable 'ansible_pipelining' from source: unknown 41684 1727204484.47209: variable 'ansible_timeout' from source: unknown 41684 1727204484.47211: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41684 1727204484.47314: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 41684 1727204484.47328: variable 'omit' from source: magic vars 41684 1727204484.47332: starting attempt loop 41684 1727204484.47334: running the handler 41684 1727204484.47342: handler run complete 41684 1727204484.47350: attempt loop complete, returning result 41684 1727204484.47353: _execute() done 41684 1727204484.47355: dumping result to json 41684 1727204484.47358: done dumping result, returning 41684 1727204484.47368: done running TaskExecutor() for 
managed-node1/TASK: Initialize NM profile exist and ansible_managed comment flag [0affcd87-79f5-3839-086d-000000000b3c] 41684 1727204484.47372: sending task result for task 0affcd87-79f5-3839-086d-000000000b3c ok: [managed-node1] => { "ansible_facts": { "lsr_net_profile_ansible_managed": false, "lsr_net_profile_exists": false, "lsr_net_profile_fingerprint": false }, "changed": false } 41684 1727204484.47881: no more pending results, returning what we have 41684 1727204484.47885: results queue empty 41684 1727204484.47886: checking for any_errors_fatal 41684 1727204484.47888: done checking for any_errors_fatal 41684 1727204484.47888: checking for max_fail_percentage 41684 1727204484.47890: done checking for max_fail_percentage 41684 1727204484.47891: checking to see if all hosts have failed and the running result is not ok 41684 1727204484.47892: done checking to see if all hosts have failed 41684 1727204484.47892: getting the remaining hosts for this loop 41684 1727204484.47894: done getting the remaining hosts for this loop 41684 1727204484.47897: getting the next task for host managed-node1 41684 1727204484.47904: done getting next task for host managed-node1 41684 1727204484.47907: ^ task is: TASK: Stat profile file 41684 1727204484.47912: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 41684 1727204484.47915: getting variables 41684 1727204484.47917: in VariableManager get_vars() 41684 1727204484.47963: Calling all_inventory to load vars for managed-node1 41684 1727204484.47968: Calling groups_inventory to load vars for managed-node1 41684 1727204484.47971: Calling all_plugins_inventory to load vars for managed-node1 41684 1727204484.47981: Calling all_plugins_play to load vars for managed-node1 41684 1727204484.47984: Calling groups_plugins_inventory to load vars for managed-node1 41684 1727204484.47986: Calling groups_plugins_play to load vars for managed-node1 41684 1727204484.49081: done sending task result for task 0affcd87-79f5-3839-086d-000000000b3c 41684 1727204484.49084: WORKER PROCESS EXITING 41684 1727204484.49437: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41684 1727204484.51687: done with get_vars() 41684 1727204484.51712: done getting variables TASK [Stat profile file] ******************************************************* task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:9 Tuesday 24 September 2024 15:01:24 -0400 (0:00:00.074) 0:00:40.927 ***** 41684 1727204484.52606: entering _queue_task() for managed-node1/stat 41684 1727204484.53696: worker is 1 (out of 1 available) 41684 1727204484.53706: exiting _queue_task() for managed-node1/stat 41684 1727204484.53718: 
done queuing things up, now waiting for results queue to drain 41684 1727204484.53719: waiting for pending results... 41684 1727204484.53739: running TaskExecutor() for managed-node1/TASK: Stat profile file 41684 1727204484.53887: in run() - task 0affcd87-79f5-3839-086d-000000000b3d 41684 1727204484.53909: variable 'ansible_search_path' from source: unknown 41684 1727204484.53917: variable 'ansible_search_path' from source: unknown 41684 1727204484.53971: calling self._execute() 41684 1727204484.54089: variable 'ansible_host' from source: host vars for 'managed-node1' 41684 1727204484.54131: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41684 1727204484.54147: variable 'omit' from source: magic vars 41684 1727204484.54545: variable 'ansible_distribution_major_version' from source: facts 41684 1727204484.54569: Evaluated conditional (ansible_distribution_major_version != '6'): True 41684 1727204484.54580: variable 'omit' from source: magic vars 41684 1727204484.54755: variable 'omit' from source: magic vars 41684 1727204484.54980: variable 'profile' from source: include params 41684 1727204484.54990: variable 'item' from source: include params 41684 1727204484.55427: variable 'item' from source: include params 41684 1727204484.55449: variable 'omit' from source: magic vars 41684 1727204484.55500: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 41684 1727204484.55544: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 41684 1727204484.55645: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 41684 1727204484.55689: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41684 1727204484.55741: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, 
class_only=False) 41684 1727204484.55801: variable 'inventory_hostname' from source: host vars for 'managed-node1' 41684 1727204484.55843: variable 'ansible_host' from source: host vars for 'managed-node1' 41684 1727204484.55852: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41684 1727204484.56082: Set connection var ansible_connection to ssh 41684 1727204484.56094: Set connection var ansible_pipelining to False 41684 1727204484.56176: Set connection var ansible_module_compression to ZIP_DEFLATED 41684 1727204484.56187: Set connection var ansible_timeout to 10 41684 1727204484.56198: Set connection var ansible_shell_executable to /bin/sh 41684 1727204484.56205: Set connection var ansible_shell_type to sh 41684 1727204484.56234: variable 'ansible_shell_executable' from source: unknown 41684 1727204484.56277: variable 'ansible_connection' from source: unknown 41684 1727204484.56284: variable 'ansible_module_compression' from source: unknown 41684 1727204484.56291: variable 'ansible_shell_type' from source: unknown 41684 1727204484.56298: variable 'ansible_shell_executable' from source: unknown 41684 1727204484.56304: variable 'ansible_host' from source: host vars for 'managed-node1' 41684 1727204484.56311: variable 'ansible_pipelining' from source: unknown 41684 1727204484.56318: variable 'ansible_timeout' from source: unknown 41684 1727204484.56325: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41684 1727204484.56670: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 41684 1727204484.56686: variable 'omit' from source: magic vars 41684 1727204484.56697: starting attempt loop 41684 1727204484.56708: running the handler 41684 1727204484.56726: _low_level_execute_command(): starting 41684 
1727204484.56739: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 41684 1727204484.58481: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 41684 1727204484.58499: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41684 1727204484.58514: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41684 1727204484.58533: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41684 1727204484.58582: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 41684 1727204484.58596: stderr chunk (state=3): >>>debug2: match not found <<< 41684 1727204484.58610: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204484.58627: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 41684 1727204484.58638: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 41684 1727204484.58648: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 41684 1727204484.58660: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41684 1727204484.58681: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41684 1727204484.58699: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41684 1727204484.58711: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 41684 1727204484.58722: stderr chunk (state=3): >>>debug2: match found <<< 41684 1727204484.58735: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204484.58820: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 41684 1727204484.58844: 
stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41684 1727204484.58859: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41684 1727204484.58954: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41684 1727204484.60547: stdout chunk (state=3): >>>/root <<< 41684 1727204484.60744: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41684 1727204484.60747: stdout chunk (state=3): >>><<< 41684 1727204484.60750: stderr chunk (state=3): >>><<< 41684 1727204484.60878: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41684 1727204484.60882: _low_level_execute_command(): starting 41684 1727204484.60885: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo 
/root/.ansible/tmp/ansible-tmp-1727204484.6077738-44615-47406795304997 `" && echo ansible-tmp-1727204484.6077738-44615-47406795304997="` echo /root/.ansible/tmp/ansible-tmp-1727204484.6077738-44615-47406795304997 `" ) && sleep 0' 41684 1727204484.61500: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 41684 1727204484.61514: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41684 1727204484.61533: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41684 1727204484.61551: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41684 1727204484.61610: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 41684 1727204484.61624: stderr chunk (state=3): >>>debug2: match not found <<< 41684 1727204484.61645: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204484.61667: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 41684 1727204484.61680: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 41684 1727204484.61692: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 41684 1727204484.61704: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41684 1727204484.61717: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41684 1727204484.61733: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41684 1727204484.61750: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 41684 1727204484.61761: stderr chunk (state=3): >>>debug2: match found <<< 41684 1727204484.61780: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204484.61860: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 41684 1727204484.61881: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41684 1727204484.61895: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41684 1727204484.61987: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41684 1727204484.63832: stdout chunk (state=3): >>>ansible-tmp-1727204484.6077738-44615-47406795304997=/root/.ansible/tmp/ansible-tmp-1727204484.6077738-44615-47406795304997 <<< 41684 1727204484.63983: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41684 1727204484.64037: stderr chunk (state=3): >>><<< 41684 1727204484.64041: stdout chunk (state=3): >>><<< 41684 1727204484.64067: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204484.6077738-44615-47406795304997=/root/.ansible/tmp/ansible-tmp-1727204484.6077738-44615-47406795304997 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: 
fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41684 1727204484.64114: variable 'ansible_module_compression' from source: unknown 41684 1727204484.64180: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-41684fyviudxd/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 41684 1727204484.64217: variable 'ansible_facts' from source: unknown 41684 1727204484.64303: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204484.6077738-44615-47406795304997/AnsiballZ_stat.py 41684 1727204484.64455: Sending initial data 41684 1727204484.64458: Sent initial data (152 bytes) 41684 1727204484.65413: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 41684 1727204484.65422: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41684 1727204484.65432: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41684 1727204484.65445: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41684 1727204484.65491: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 41684 1727204484.65495: stderr chunk (state=3): >>>debug2: match not found <<< 41684 1727204484.65506: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204484.65518: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 41684 1727204484.65526: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 41684 1727204484.65533: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 41684 1727204484.65540: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41684 1727204484.65549: stderr chunk (state=3): >>>debug1: Reading 
configuration data /etc/ssh/ssh_config <<< 41684 1727204484.65566: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41684 1727204484.65572: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 41684 1727204484.65579: stderr chunk (state=3): >>>debug2: match found <<< 41684 1727204484.65589: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204484.65673: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 41684 1727204484.65680: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41684 1727204484.65683: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41684 1727204484.65773: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41684 1727204484.67485: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 41684 1727204484.67527: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 41684 1727204484.67590: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-41684fyviudxd/tmptoi862vp /root/.ansible/tmp/ansible-tmp-1727204484.6077738-44615-47406795304997/AnsiballZ_stat.py <<< 41684 1727204484.67638: stderr chunk (state=3): >>>debug1: 
Couldn't stat remote file: No such file or directory <<< 41684 1727204484.68872: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41684 1727204484.68969: stderr chunk (state=3): >>><<< 41684 1727204484.68973: stdout chunk (state=3): >>><<< 41684 1727204484.69093: done transferring module to remote 41684 1727204484.69097: _low_level_execute_command(): starting 41684 1727204484.69104: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204484.6077738-44615-47406795304997/ /root/.ansible/tmp/ansible-tmp-1727204484.6077738-44615-47406795304997/AnsiballZ_stat.py && sleep 0' 41684 1727204484.69743: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 41684 1727204484.69765: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41684 1727204484.69782: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41684 1727204484.69800: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41684 1727204484.69842: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 41684 1727204484.69855: stderr chunk (state=3): >>>debug2: match not found <<< 41684 1727204484.69877: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204484.69894: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 41684 1727204484.69905: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 41684 1727204484.69915: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 41684 1727204484.69927: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41684 1727204484.69940: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41684 1727204484.69954: stderr chunk 
(state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41684 1727204484.69972: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 41684 1727204484.69987: stderr chunk (state=3): >>>debug2: match found <<< 41684 1727204484.70004: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204484.70086: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 41684 1727204484.70109: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41684 1727204484.70127: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41684 1727204484.70218: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41684 1727204484.71925: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41684 1727204484.72010: stderr chunk (state=3): >>><<< 41684 1727204484.72013: stdout chunk (state=3): >>><<< 41684 1727204484.72119: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41684 1727204484.72123: _low_level_execute_command(): starting 41684 1727204484.72125: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204484.6077738-44615-47406795304997/AnsiballZ_stat.py && sleep 0' 41684 1727204484.72753: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 41684 1727204484.72766: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41684 1727204484.72775: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41684 1727204484.72789: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41684 1727204484.72836: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 41684 1727204484.72845: stderr chunk (state=3): >>>debug2: match not found <<< 41684 1727204484.72860: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204484.72877: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 41684 1727204484.72884: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 41684 1727204484.72891: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 41684 1727204484.72899: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41684 1727204484.72919: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41684 1727204484.72922: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41684 1727204484.72940: 
stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 41684 1727204484.72943: stderr chunk (state=3): >>>debug2: match found <<< 41684 1727204484.72950: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204484.73025: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 41684 1727204484.73041: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41684 1727204484.73099: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41684 1727204484.73149: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41684 1727204484.86307: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-ethtest0", "follow": false, "checksum_algorithm": "sha1"}}} <<< 41684 1727204484.87198: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.148 closed. 
<<< 41684 1727204484.87256: stderr chunk (state=3): >>><<< 41684 1727204484.87261: stdout chunk (state=3): >>><<< 41684 1727204484.87282: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-ethtest0", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.148 closed. 
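The module result above comes from Ansible's `stat` module probing `/etc/sysconfig/network-scripts/ifcfg-ethtest0` and reporting `{"stat": {"exists": false}}`. As a minimal sketch (not part of the test run itself), the same existence check can be reproduced directly in `/bin/sh`:

```shell
# Sketch of the check the stat task performs: does the initscripts
# profile file for ethtest0 exist on the managed node? The log's
# result for this path was {"stat": {"exists": false}}.
profile_file="/etc/sysconfig/network-scripts/ifcfg-ethtest0"
if [ -e "$profile_file" ]; then
  exists=true
else
  exists=false
fi
echo "exists=$exists"
```

On a host with no initscripts profile for `ethtest0`, this prints `exists=false`, matching the task result in the log.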
41684 1727204484.87306: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/etc/sysconfig/network-scripts/ifcfg-ethtest0', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204484.6077738-44615-47406795304997/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 41684 1727204484.87314: _low_level_execute_command(): starting 41684 1727204484.87319: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204484.6077738-44615-47406795304997/ > /dev/null 2>&1 && sleep 0' 41684 1727204484.87814: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 41684 1727204484.87818: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41684 1727204484.87850: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found <<< 41684 1727204484.87854: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 41684 1727204484.87856: 
stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204484.87913: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 41684 1727204484.87917: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 41684 1727204484.87984: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41684 1727204484.89731: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41684 1727204484.89789: stderr chunk (state=3): >>><<< 41684 1727204484.89792: stdout chunk (state=3): >>><<< 41684 1727204484.89807: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master 
session id: 2 debug2: Received exit status from master 0 41684 1727204484.89813: handler run complete 41684 1727204484.89831: attempt loop complete, returning result 41684 1727204484.89834: _execute() done 41684 1727204484.89837: dumping result to json 41684 1727204484.89839: done dumping result, returning 41684 1727204484.89847: done running TaskExecutor() for managed-node1/TASK: Stat profile file [0affcd87-79f5-3839-086d-000000000b3d] 41684 1727204484.89853: sending task result for task 0affcd87-79f5-3839-086d-000000000b3d 41684 1727204484.89954: done sending task result for task 0affcd87-79f5-3839-086d-000000000b3d 41684 1727204484.89956: WORKER PROCESS EXITING ok: [managed-node1] => { "changed": false, "stat": { "exists": false } } 41684 1727204484.90022: no more pending results, returning what we have 41684 1727204484.90025: results queue empty 41684 1727204484.90026: checking for any_errors_fatal 41684 1727204484.90032: done checking for any_errors_fatal 41684 1727204484.90033: checking for max_fail_percentage 41684 1727204484.90034: done checking for max_fail_percentage 41684 1727204484.90035: checking to see if all hosts have failed and the running result is not ok 41684 1727204484.90036: done checking to see if all hosts have failed 41684 1727204484.90036: getting the remaining hosts for this loop 41684 1727204484.90038: done getting the remaining hosts for this loop 41684 1727204484.90043: getting the next task for host managed-node1 41684 1727204484.90051: done getting next task for host managed-node1 41684 1727204484.90053: ^ task is: TASK: Set NM profile exist flag based on the profile files 41684 1727204484.90059: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? 
(HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 41684 1727204484.90067: getting variables 41684 1727204484.90069: in VariableManager get_vars() 41684 1727204484.90115: Calling all_inventory to load vars for managed-node1 41684 1727204484.90118: Calling groups_inventory to load vars for managed-node1 41684 1727204484.90120: Calling all_plugins_inventory to load vars for managed-node1 41684 1727204484.90131: Calling all_plugins_play to load vars for managed-node1 41684 1727204484.90133: Calling groups_plugins_inventory to load vars for managed-node1 41684 1727204484.90135: Calling groups_plugins_play to load vars for managed-node1 41684 1727204484.91007: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41684 1727204484.91974: done with get_vars() 41684 1727204484.91993: done getting variables 41684 1727204484.92039: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set NM profile exist flag based on the profile files] ******************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:17 Tuesday 24 September 2024 15:01:24 -0400 (0:00:00.394) 0:00:41.322 ***** 41684 1727204484.92073: entering _queue_task() for managed-node1/set_fact 41684 1727204484.92319: worker is 1 (out of 1 available) 41684 1727204484.92334: exiting _queue_task() for managed-node1/set_fact 41684 1727204484.92347: done queuing things up, now waiting for results queue to drain 41684 1727204484.92348: waiting for pending results... 41684 1727204484.92538: running TaskExecutor() for managed-node1/TASK: Set NM profile exist flag based on the profile files 41684 1727204484.92626: in run() - task 0affcd87-79f5-3839-086d-000000000b3e 41684 1727204484.92637: variable 'ansible_search_path' from source: unknown 41684 1727204484.92640: variable 'ansible_search_path' from source: unknown 41684 1727204484.92675: calling self._execute() 41684 1727204484.92766: variable 'ansible_host' from source: host vars for 'managed-node1' 41684 1727204484.92772: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41684 1727204484.92784: variable 'omit' from source: magic vars 41684 1727204484.93072: variable 'ansible_distribution_major_version' from source: facts 41684 1727204484.93083: Evaluated conditional (ansible_distribution_major_version != '6'): True 41684 1727204484.93172: variable 'profile_stat' from source: set_fact 41684 1727204484.93180: Evaluated conditional (profile_stat.stat.exists): False 41684 1727204484.93183: when evaluation is False, skipping this task 41684 1727204484.93186: _execute() done 41684 1727204484.93189: dumping result to json 41684 1727204484.93191: done dumping 
result, returning 41684 1727204484.93198: done running TaskExecutor() for managed-node1/TASK: Set NM profile exist flag based on the profile files [0affcd87-79f5-3839-086d-000000000b3e] 41684 1727204484.93204: sending task result for task 0affcd87-79f5-3839-086d-000000000b3e 41684 1727204484.93297: done sending task result for task 0affcd87-79f5-3839-086d-000000000b3e 41684 1727204484.93300: WORKER PROCESS EXITING skipping: [managed-node1] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 41684 1727204484.93374: no more pending results, returning what we have 41684 1727204484.93381: results queue empty 41684 1727204484.93382: checking for any_errors_fatal 41684 1727204484.93389: done checking for any_errors_fatal 41684 1727204484.93390: checking for max_fail_percentage 41684 1727204484.93391: done checking for max_fail_percentage 41684 1727204484.93392: checking to see if all hosts have failed and the running result is not ok 41684 1727204484.93393: done checking to see if all hosts have failed 41684 1727204484.93394: getting the remaining hosts for this loop 41684 1727204484.93395: done getting the remaining hosts for this loop 41684 1727204484.93399: getting the next task for host managed-node1 41684 1727204484.93405: done getting next task for host managed-node1 41684 1727204484.93408: ^ task is: TASK: Get NM profile info 41684 1727204484.93413: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 41684 1727204484.93416: getting variables 41684 1727204484.93418: in VariableManager get_vars() 41684 1727204484.93458: Calling all_inventory to load vars for managed-node1 41684 1727204484.93461: Calling groups_inventory to load vars for managed-node1 41684 1727204484.93463: Calling all_plugins_inventory to load vars for managed-node1 41684 1727204484.93476: Calling all_plugins_play to load vars for managed-node1 41684 1727204484.93478: Calling groups_plugins_inventory to load vars for managed-node1 41684 1727204484.93480: Calling groups_plugins_play to load vars for managed-node1 41684 1727204484.94444: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41684 1727204484.95369: done with get_vars() 41684 1727204484.95388: done getting variables 41684 1727204484.95459: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [Get NM profile info] ***************************************************** task path: 
/tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:25 Tuesday 24 September 2024 15:01:24 -0400 (0:00:00.034) 0:00:41.356 ***** 41684 1727204484.95485: entering _queue_task() for managed-node1/shell 41684 1727204484.95487: Creating lock for shell 41684 1727204484.95731: worker is 1 (out of 1 available) 41684 1727204484.95746: exiting _queue_task() for managed-node1/shell 41684 1727204484.95758: done queuing things up, now waiting for results queue to drain 41684 1727204484.95760: waiting for pending results... 41684 1727204484.95943: running TaskExecutor() for managed-node1/TASK: Get NM profile info 41684 1727204484.96030: in run() - task 0affcd87-79f5-3839-086d-000000000b3f 41684 1727204484.96042: variable 'ansible_search_path' from source: unknown 41684 1727204484.96046: variable 'ansible_search_path' from source: unknown 41684 1727204484.96080: calling self._execute() 41684 1727204484.96155: variable 'ansible_host' from source: host vars for 'managed-node1' 41684 1727204484.96159: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41684 1727204484.96173: variable 'omit' from source: magic vars 41684 1727204484.96455: variable 'ansible_distribution_major_version' from source: facts 41684 1727204484.96471: Evaluated conditional (ansible_distribution_major_version != '6'): True 41684 1727204484.96474: variable 'omit' from source: magic vars 41684 1727204484.96509: variable 'omit' from source: magic vars 41684 1727204484.96581: variable 'profile' from source: include params 41684 1727204484.96585: variable 'item' from source: include params 41684 1727204484.96639: variable 'item' from source: include params 41684 1727204484.96653: variable 'omit' from source: magic vars 41684 1727204484.96691: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 41684 1727204484.96719: Loading Connection 'ssh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 41684 1727204484.96740: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 41684 1727204484.96751: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41684 1727204484.96760: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41684 1727204484.96788: variable 'inventory_hostname' from source: host vars for 'managed-node1' 41684 1727204484.96793: variable 'ansible_host' from source: host vars for 'managed-node1' 41684 1727204484.96795: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41684 1727204484.96863: Set connection var ansible_connection to ssh 41684 1727204484.96873: Set connection var ansible_pipelining to False 41684 1727204484.96878: Set connection var ansible_module_compression to ZIP_DEFLATED 41684 1727204484.96883: Set connection var ansible_timeout to 10 41684 1727204484.96889: Set connection var ansible_shell_executable to /bin/sh 41684 1727204484.96893: Set connection var ansible_shell_type to sh 41684 1727204484.96911: variable 'ansible_shell_executable' from source: unknown 41684 1727204484.96914: variable 'ansible_connection' from source: unknown 41684 1727204484.96917: variable 'ansible_module_compression' from source: unknown 41684 1727204484.96919: variable 'ansible_shell_type' from source: unknown 41684 1727204484.96921: variable 'ansible_shell_executable' from source: unknown 41684 1727204484.96924: variable 'ansible_host' from source: host vars for 'managed-node1' 41684 1727204484.96926: variable 'ansible_pipelining' from source: unknown 41684 1727204484.96929: variable 'ansible_timeout' from source: unknown 41684 1727204484.96933: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41684 
1727204484.97034: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 41684 1727204484.97043: variable 'omit' from source: magic vars 41684 1727204484.97047: starting attempt loop 41684 1727204484.97051: running the handler 41684 1727204484.97060: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 41684 1727204484.97080: _low_level_execute_command(): starting 41684 1727204484.97089: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 41684 1727204484.97622: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 41684 1727204484.97634: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41684 1727204484.97659: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found <<< 41684 1727204484.97675: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 41684 1727204484.97687: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' 
host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204484.97730: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 41684 1727204484.97742: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 41684 1727204484.97818: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41684 1727204484.99355: stdout chunk (state=3): >>>/root <<< 41684 1727204484.99452: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41684 1727204484.99516: stderr chunk (state=3): >>><<< 41684 1727204484.99519: stdout chunk (state=3): >>><<< 41684 1727204484.99543: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41684 
1727204484.99553: _low_level_execute_command(): starting 41684 1727204484.99559: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204484.995408-44638-233808215681608 `" && echo ansible-tmp-1727204484.995408-44638-233808215681608="` echo /root/.ansible/tmp/ansible-tmp-1727204484.995408-44638-233808215681608 `" ) && sleep 0' 41684 1727204485.00029: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 41684 1727204485.00043: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41684 1727204485.00078: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204485.00092: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found <<< 41684 1727204485.00103: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204485.00150: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 41684 1727204485.00162: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 41684 1727204485.00229: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41684 1727204485.02051: stdout chunk 
(state=3): >>>ansible-tmp-1727204484.995408-44638-233808215681608=/root/.ansible/tmp/ansible-tmp-1727204484.995408-44638-233808215681608 <<< 41684 1727204485.02167: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41684 1727204485.02216: stderr chunk (state=3): >>><<< 41684 1727204485.02220: stdout chunk (state=3): >>><<< 41684 1727204485.02238: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204484.995408-44638-233808215681608=/root/.ansible/tmp/ansible-tmp-1727204484.995408-44638-233808215681608 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41684 1727204485.02268: variable 'ansible_module_compression' from source: unknown 41684 1727204485.02313: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-41684fyviudxd/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 41684 1727204485.02344: variable 'ansible_facts' from source: unknown 41684 
1727204485.02409: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204484.995408-44638-233808215681608/AnsiballZ_command.py 41684 1727204485.02524: Sending initial data 41684 1727204485.02527: Sent initial data (155 bytes) 41684 1727204485.03510: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 41684 1727204485.03525: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41684 1727204485.03590: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41684 1727204485.05297: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server 
supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 41684 1727204485.05336: stderr chunk (state=3): >>>debug1: Using server download size 261120 <<< 41684 1727204485.05343: stderr chunk (state=3): >>>debug1: Using server upload size 261120 <<< 41684 1727204485.05349: stderr chunk (state=3): >>>debug1: Server handle limit 1019; using 64 <<< 41684 1727204485.05415: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-41684fyviudxd/tmpdmj6w_rc /root/.ansible/tmp/ansible-tmp-1727204484.995408-44638-233808215681608/AnsiballZ_command.py <<< 41684 1727204485.05480: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 41684 1727204485.06670: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41684 1727204485.06760: stderr chunk (state=3): >>><<< 41684 1727204485.06768: stdout chunk (state=3): >>><<< 41684 1727204485.06788: done transferring module to remote 41684 1727204485.06804: _low_level_execute_command(): starting 41684 1727204485.06807: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204484.995408-44638-233808215681608/ /root/.ansible/tmp/ansible-tmp-1727204484.995408-44638-233808215681608/AnsiballZ_command.py && sleep 0' 41684 1727204485.07478: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 41684 1727204485.07487: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41684 1727204485.07498: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41684 1727204485.07512: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41684 1727204485.07550: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 41684 1727204485.07561: stderr chunk (state=3): >>>debug2: match not found <<< 41684 
1727204485.07575: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204485.07588: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 41684 1727204485.07597: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 41684 1727204485.07603: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 41684 1727204485.07611: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41684 1727204485.07621: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41684 1727204485.07632: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41684 1727204485.07641: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 41684 1727204485.07647: stderr chunk (state=3): >>>debug2: match found <<< 41684 1727204485.07656: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204485.07740: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 41684 1727204485.07761: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41684 1727204485.07773: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41684 1727204485.07859: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41684 1727204485.09539: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41684 1727204485.09587: stderr chunk (state=3): >>><<< 41684 1727204485.09590: stdout chunk (state=3): >>><<< 41684 1727204485.09605: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration 
data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41684 1727204485.09608: _low_level_execute_command(): starting 41684 1727204485.09614: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204484.995408-44638-233808215681608/AnsiballZ_command.py && sleep 0' 41684 1727204485.10067: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 41684 1727204485.10071: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41684 1727204485.10116: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204485.10119: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 41684 1727204485.10121: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204485.10200: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 41684 1727204485.10203: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41684 1727204485.10302: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41684 1727204485.25170: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "", "rc": 1, "cmd": "nmcli -f NAME,FILENAME connection show |grep ethtest0 | grep /etc", "start": "2024-09-24 15:01:25.232212", "end": "2024-09-24 15:01:25.250835", "delta": "0:00:00.018623", "failed": true, "msg": "non-zero return code", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep ethtest0 | grep /etc", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 41684 1727204485.26343: stderr chunk (state=3): >>>debug2: Received exit status from master 1 Shared connection to 10.31.9.148 closed. 
<<< 41684 1727204485.26347: stdout chunk (state=3): >>><<< 41684 1727204485.26350: stderr chunk (state=3): >>><<< 41684 1727204485.26513: _low_level_execute_command() done: rc=1, stdout= {"changed": true, "stdout": "", "stderr": "", "rc": 1, "cmd": "nmcli -f NAME,FILENAME connection show |grep ethtest0 | grep /etc", "start": "2024-09-24 15:01:25.232212", "end": "2024-09-24 15:01:25.250835", "delta": "0:00:00.018623", "failed": true, "msg": "non-zero return code", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep ethtest0 | grep /etc", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 1 Shared connection to 10.31.9.148 closed. 
41684 1727204485.26518: done with _execute_module (ansible.legacy.command, {'_raw_params': 'nmcli -f NAME,FILENAME connection show |grep ethtest0 | grep /etc', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204484.995408-44638-233808215681608/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 41684 1727204485.26521: _low_level_execute_command(): starting 41684 1727204485.26524: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204484.995408-44638-233808215681608/ > /dev/null 2>&1 && sleep 0' 41684 1727204485.27532: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 41684 1727204485.27541: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41684 1727204485.27553: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 41684 1727204485.27566: stderr chunk (state=3): >>>debug2: match not found <<< 41684 1727204485.27575: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204485.27588: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 41684 1727204485.27595: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 
10.31.9.148 is address <<< 41684 1727204485.27602: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 41684 1727204485.27610: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41684 1727204485.27619: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41684 1727204485.27630: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41684 1727204485.27638: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 41684 1727204485.27645: stderr chunk (state=3): >>>debug2: match found <<< 41684 1727204485.27657: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204485.27728: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 41684 1727204485.27750: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41684 1727204485.27771: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41684 1727204485.27851: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41684 1727204485.29690: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41684 1727204485.29693: stdout chunk (state=3): >>><<< 41684 1727204485.29695: stderr chunk (state=3): >>><<< 41684 1727204485.29871: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 
is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41684 1727204485.29875: handler run complete 41684 1727204485.29877: Evaluated conditional (False): False 41684 1727204485.29880: attempt loop complete, returning result 41684 1727204485.29882: _execute() done 41684 1727204485.29884: dumping result to json 41684 1727204485.29886: done dumping result, returning 41684 1727204485.29888: done running TaskExecutor() for managed-node1/TASK: Get NM profile info [0affcd87-79f5-3839-086d-000000000b3f] 41684 1727204485.29890: sending task result for task 0affcd87-79f5-3839-086d-000000000b3f 41684 1727204485.29970: done sending task result for task 0affcd87-79f5-3839-086d-000000000b3f 41684 1727204485.29973: WORKER PROCESS EXITING fatal: [managed-node1]: FAILED! 
=> { "changed": false, "cmd": "nmcli -f NAME,FILENAME connection show |grep ethtest0 | grep /etc", "delta": "0:00:00.018623", "end": "2024-09-24 15:01:25.250835", "rc": 1, "start": "2024-09-24 15:01:25.232212" } MSG: non-zero return code ...ignoring 41684 1727204485.30078: no more pending results, returning what we have 41684 1727204485.30083: results queue empty 41684 1727204485.30084: checking for any_errors_fatal 41684 1727204485.30091: done checking for any_errors_fatal 41684 1727204485.30092: checking for max_fail_percentage 41684 1727204485.30094: done checking for max_fail_percentage 41684 1727204485.30095: checking to see if all hosts have failed and the running result is not ok 41684 1727204485.30096: done checking to see if all hosts have failed 41684 1727204485.30097: getting the remaining hosts for this loop 41684 1727204485.30099: done getting the remaining hosts for this loop 41684 1727204485.30105: getting the next task for host managed-node1 41684 1727204485.30114: done getting next task for host managed-node1 41684 1727204485.30117: ^ task is: TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output 41684 1727204485.30123: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 41684 1727204485.30128: getting variables 41684 1727204485.30130: in VariableManager get_vars() 41684 1727204485.30301: Calling all_inventory to load vars for managed-node1 41684 1727204485.30304: Calling groups_inventory to load vars for managed-node1 41684 1727204485.30307: Calling all_plugins_inventory to load vars for managed-node1 41684 1727204485.30319: Calling all_plugins_play to load vars for managed-node1 41684 1727204485.30322: Calling groups_plugins_inventory to load vars for managed-node1 41684 1727204485.30325: Calling groups_plugins_play to load vars for managed-node1 41684 1727204485.32135: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41684 1727204485.33966: done with get_vars() 41684 1727204485.33998: done getting variables 41684 1727204485.34069: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set NM profile exist flag and ansible_managed flag true based on the nmcli output] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:35 Tuesday 24 September 2024 15:01:25 -0400 (0:00:00.386) 0:00:41.742 ***** 41684 1727204485.34104: entering _queue_task() for managed-node1/set_fact 41684 1727204485.34458: worker is 1 (out of 1 available) 41684 1727204485.34475: exiting _queue_task() for 
managed-node1/set_fact 41684 1727204485.34488: done queuing things up, now waiting for results queue to drain 41684 1727204485.34490: waiting for pending results... 41684 1727204485.34802: running TaskExecutor() for managed-node1/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output 41684 1727204485.34954: in run() - task 0affcd87-79f5-3839-086d-000000000b40 41684 1727204485.34979: variable 'ansible_search_path' from source: unknown 41684 1727204485.34987: variable 'ansible_search_path' from source: unknown 41684 1727204485.35031: calling self._execute() 41684 1727204485.35253: variable 'ansible_host' from source: host vars for 'managed-node1' 41684 1727204485.35270: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41684 1727204485.35287: variable 'omit' from source: magic vars 41684 1727204485.35688: variable 'ansible_distribution_major_version' from source: facts 41684 1727204485.35705: Evaluated conditional (ansible_distribution_major_version != '6'): True 41684 1727204485.35847: variable 'nm_profile_exists' from source: set_fact 41684 1727204485.35875: Evaluated conditional (nm_profile_exists.rc == 0): False 41684 1727204485.35884: when evaluation is False, skipping this task 41684 1727204485.35891: _execute() done 41684 1727204485.35899: dumping result to json 41684 1727204485.35906: done dumping result, returning 41684 1727204485.35915: done running TaskExecutor() for managed-node1/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output [0affcd87-79f5-3839-086d-000000000b40] 41684 1727204485.35925: sending task result for task 0affcd87-79f5-3839-086d-000000000b40 skipping: [managed-node1] => { "changed": false, "false_condition": "nm_profile_exists.rc == 0", "skip_reason": "Conditional result was False" } 41684 1727204485.36083: no more pending results, returning what we have 41684 1727204485.36088: results queue empty 41684 1727204485.36089: checking for 
any_errors_fatal 41684 1727204485.36098: done checking for any_errors_fatal 41684 1727204485.36099: checking for max_fail_percentage 41684 1727204485.36100: done checking for max_fail_percentage 41684 1727204485.36101: checking to see if all hosts have failed and the running result is not ok 41684 1727204485.36102: done checking to see if all hosts have failed 41684 1727204485.36103: getting the remaining hosts for this loop 41684 1727204485.36105: done getting the remaining hosts for this loop 41684 1727204485.36110: getting the next task for host managed-node1 41684 1727204485.36122: done getting next task for host managed-node1 41684 1727204485.36125: ^ task is: TASK: Get the ansible_managed comment in ifcfg-{{ profile }} 41684 1727204485.36132: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 41684 1727204485.36136: getting variables 41684 1727204485.36138: in VariableManager get_vars() 41684 1727204485.36189: Calling all_inventory to load vars for managed-node1 41684 1727204485.36193: Calling groups_inventory to load vars for managed-node1 41684 1727204485.36195: Calling all_plugins_inventory to load vars for managed-node1 41684 1727204485.36209: Calling all_plugins_play to load vars for managed-node1 41684 1727204485.36211: Calling groups_plugins_inventory to load vars for managed-node1 41684 1727204485.36214: Calling groups_plugins_play to load vars for managed-node1 41684 1727204485.37231: done sending task result for task 0affcd87-79f5-3839-086d-000000000b40 41684 1727204485.37235: WORKER PROCESS EXITING 41684 1727204485.38269: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41684 1727204485.40023: done with get_vars() 41684 1727204485.40057: done getting variables 41684 1727204485.40123: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 41684 1727204485.40275: variable 'profile' from source: include params 41684 1727204485.40279: variable 'item' from source: include params 41684 1727204485.40352: variable 'item' from source: include params TASK [Get the ansible_managed comment in ifcfg-ethtest0] *********************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:49 Tuesday 24 September 2024 15:01:25 -0400 (0:00:00.062) 0:00:41.805 ***** 41684 1727204485.40395: entering _queue_task() for managed-node1/command 41684 1727204485.40788: worker is 1 (out of 1 available) 41684 1727204485.40811: exiting _queue_task() for managed-node1/command 41684 
1727204485.40825: done queuing things up, now waiting for results queue to drain 41684 1727204485.40826: waiting for pending results... 41684 1727204485.41130: running TaskExecutor() for managed-node1/TASK: Get the ansible_managed comment in ifcfg-ethtest0 41684 1727204485.41279: in run() - task 0affcd87-79f5-3839-086d-000000000b42 41684 1727204485.41299: variable 'ansible_search_path' from source: unknown 41684 1727204485.41307: variable 'ansible_search_path' from source: unknown 41684 1727204485.41348: calling self._execute() 41684 1727204485.41466: variable 'ansible_host' from source: host vars for 'managed-node1' 41684 1727204485.41484: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41684 1727204485.41502: variable 'omit' from source: magic vars 41684 1727204485.41922: variable 'ansible_distribution_major_version' from source: facts 41684 1727204485.41942: Evaluated conditional (ansible_distribution_major_version != '6'): True 41684 1727204485.42079: variable 'profile_stat' from source: set_fact 41684 1727204485.42094: Evaluated conditional (profile_stat.stat.exists): False 41684 1727204485.42102: when evaluation is False, skipping this task 41684 1727204485.42109: _execute() done 41684 1727204485.42118: dumping result to json 41684 1727204485.42129: done dumping result, returning 41684 1727204485.42140: done running TaskExecutor() for managed-node1/TASK: Get the ansible_managed comment in ifcfg-ethtest0 [0affcd87-79f5-3839-086d-000000000b42] 41684 1727204485.42153: sending task result for task 0affcd87-79f5-3839-086d-000000000b42 skipping: [managed-node1] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 41684 1727204485.42316: no more pending results, returning what we have 41684 1727204485.42321: results queue empty 41684 1727204485.42322: checking for any_errors_fatal 41684 1727204485.42333: done checking for any_errors_fatal 41684 1727204485.42334: checking 
for max_fail_percentage 41684 1727204485.42335: done checking for max_fail_percentage 41684 1727204485.42336: checking to see if all hosts have failed and the running result is not ok 41684 1727204485.42337: done checking to see if all hosts have failed 41684 1727204485.42338: getting the remaining hosts for this loop 41684 1727204485.42340: done getting the remaining hosts for this loop 41684 1727204485.42344: getting the next task for host managed-node1 41684 1727204485.42354: done getting next task for host managed-node1 41684 1727204485.42356: ^ task is: TASK: Verify the ansible_managed comment in ifcfg-{{ profile }} 41684 1727204485.42366: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 41684 1727204485.42370: getting variables 41684 1727204485.42372: in VariableManager get_vars() 41684 1727204485.42417: Calling all_inventory to load vars for managed-node1 41684 1727204485.42420: Calling groups_inventory to load vars for managed-node1 41684 1727204485.42422: Calling all_plugins_inventory to load vars for managed-node1 41684 1727204485.42437: Calling all_plugins_play to load vars for managed-node1 41684 1727204485.42440: Calling groups_plugins_inventory to load vars for managed-node1 41684 1727204485.42444: Calling groups_plugins_play to load vars for managed-node1 41684 1727204485.43983: done sending task result for task 0affcd87-79f5-3839-086d-000000000b42 41684 1727204485.43986: WORKER PROCESS EXITING 41684 1727204485.44179: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41684 1727204485.45841: done with get_vars() 41684 1727204485.45870: done getting variables 41684 1727204485.45928: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 41684 1727204485.46046: variable 'profile' from source: include params 41684 1727204485.46050: variable 'item' from source: include params 41684 1727204485.46116: variable 'item' from source: include params TASK [Verify the ansible_managed comment in ifcfg-ethtest0] ******************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:56 Tuesday 24 September 2024 15:01:25 -0400 (0:00:00.057) 0:00:41.863 ***** 41684 1727204485.46149: entering _queue_task() for managed-node1/set_fact 41684 1727204485.46456: worker is 1 (out of 1 available) 41684 1727204485.46474: exiting _queue_task() for managed-node1/set_fact 41684 
1727204485.46488: done queuing things up, now waiting for results queue to drain 41684 1727204485.46489: waiting for pending results... 41684 1727204485.46773: running TaskExecutor() for managed-node1/TASK: Verify the ansible_managed comment in ifcfg-ethtest0 41684 1727204485.46891: in run() - task 0affcd87-79f5-3839-086d-000000000b43 41684 1727204485.46911: variable 'ansible_search_path' from source: unknown 41684 1727204485.46918: variable 'ansible_search_path' from source: unknown 41684 1727204485.46968: calling self._execute() 41684 1727204485.47081: variable 'ansible_host' from source: host vars for 'managed-node1' 41684 1727204485.47093: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41684 1727204485.47107: variable 'omit' from source: magic vars 41684 1727204485.47482: variable 'ansible_distribution_major_version' from source: facts 41684 1727204485.47500: Evaluated conditional (ansible_distribution_major_version != '6'): True 41684 1727204485.47626: variable 'profile_stat' from source: set_fact 41684 1727204485.47642: Evaluated conditional (profile_stat.stat.exists): False 41684 1727204485.47649: when evaluation is False, skipping this task 41684 1727204485.47656: _execute() done 41684 1727204485.47665: dumping result to json 41684 1727204485.47673: done dumping result, returning 41684 1727204485.47682: done running TaskExecutor() for managed-node1/TASK: Verify the ansible_managed comment in ifcfg-ethtest0 [0affcd87-79f5-3839-086d-000000000b43] 41684 1727204485.47693: sending task result for task 0affcd87-79f5-3839-086d-000000000b43 skipping: [managed-node1] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 41684 1727204485.47850: no more pending results, returning what we have 41684 1727204485.47854: results queue empty 41684 1727204485.47855: checking for any_errors_fatal 41684 1727204485.47867: done checking for any_errors_fatal 41684 1727204485.47868: 
checking for max_fail_percentage 41684 1727204485.47871: done checking for max_fail_percentage 41684 1727204485.47871: checking to see if all hosts have failed and the running result is not ok 41684 1727204485.47873: done checking to see if all hosts have failed 41684 1727204485.47873: getting the remaining hosts for this loop 41684 1727204485.47875: done getting the remaining hosts for this loop 41684 1727204485.47880: getting the next task for host managed-node1 41684 1727204485.47889: done getting next task for host managed-node1 41684 1727204485.47892: ^ task is: TASK: Get the fingerprint comment in ifcfg-{{ profile }} 41684 1727204485.47898: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 41684 1727204485.47901: getting variables 41684 1727204485.47903: in VariableManager get_vars() 41684 1727204485.47950: Calling all_inventory to load vars for managed-node1 41684 1727204485.47953: Calling groups_inventory to load vars for managed-node1 41684 1727204485.47955: Calling all_plugins_inventory to load vars for managed-node1 41684 1727204485.47974: Calling all_plugins_play to load vars for managed-node1 41684 1727204485.47977: Calling groups_plugins_inventory to load vars for managed-node1 41684 1727204485.47980: Calling groups_plugins_play to load vars for managed-node1 41684 1727204485.48983: done sending task result for task 0affcd87-79f5-3839-086d-000000000b43 41684 1727204485.48987: WORKER PROCESS EXITING 41684 1727204485.49886: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41684 1727204485.51639: done with get_vars() 41684 1727204485.51670: done getting variables 41684 1727204485.51729: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 41684 1727204485.51850: variable 'profile' from source: include params 41684 1727204485.51854: variable 'item' from source: include params 41684 1727204485.51918: variable 'item' from source: include params TASK [Get the fingerprint comment in ifcfg-ethtest0] *************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:62 Tuesday 24 September 2024 15:01:25 -0400 (0:00:00.057) 0:00:41.921 ***** 41684 1727204485.51950: entering _queue_task() for managed-node1/command 41684 1727204485.52272: worker is 1 (out of 1 available) 41684 1727204485.52287: exiting _queue_task() for managed-node1/command 41684 
1727204485.52301: done queuing things up, now waiting for results queue to drain 41684 1727204485.52302: waiting for pending results... 41684 1727204485.52593: running TaskExecutor() for managed-node1/TASK: Get the fingerprint comment in ifcfg-ethtest0 41684 1727204485.52729: in run() - task 0affcd87-79f5-3839-086d-000000000b44 41684 1727204485.52752: variable 'ansible_search_path' from source: unknown 41684 1727204485.52760: variable 'ansible_search_path' from source: unknown 41684 1727204485.52803: calling self._execute() 41684 1727204485.52909: variable 'ansible_host' from source: host vars for 'managed-node1' 41684 1727204485.52922: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41684 1727204485.52939: variable 'omit' from source: magic vars 41684 1727204485.53319: variable 'ansible_distribution_major_version' from source: facts 41684 1727204485.53336: Evaluated conditional (ansible_distribution_major_version != '6'): True 41684 1727204485.53471: variable 'profile_stat' from source: set_fact 41684 1727204485.53488: Evaluated conditional (profile_stat.stat.exists): False 41684 1727204485.53495: when evaluation is False, skipping this task 41684 1727204485.53508: _execute() done 41684 1727204485.53515: dumping result to json 41684 1727204485.53521: done dumping result, returning 41684 1727204485.53530: done running TaskExecutor() for managed-node1/TASK: Get the fingerprint comment in ifcfg-ethtest0 [0affcd87-79f5-3839-086d-000000000b44] 41684 1727204485.53541: sending task result for task 0affcd87-79f5-3839-086d-000000000b44 skipping: [managed-node1] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 41684 1727204485.53698: no more pending results, returning what we have 41684 1727204485.53703: results queue empty 41684 1727204485.53704: checking for any_errors_fatal 41684 1727204485.53715: done checking for any_errors_fatal 41684 1727204485.53716: checking for 
max_fail_percentage 41684 1727204485.53718: done checking for max_fail_percentage 41684 1727204485.53719: checking to see if all hosts have failed and the running result is not ok 41684 1727204485.53720: done checking to see if all hosts have failed 41684 1727204485.53721: getting the remaining hosts for this loop 41684 1727204485.53723: done getting the remaining hosts for this loop 41684 1727204485.53727: getting the next task for host managed-node1 41684 1727204485.53736: done getting next task for host managed-node1 41684 1727204485.53738: ^ task is: TASK: Verify the fingerprint comment in ifcfg-{{ profile }} 41684 1727204485.53745: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 41684 1727204485.53748: getting variables 41684 1727204485.53750: in VariableManager get_vars() 41684 1727204485.53801: Calling all_inventory to load vars for managed-node1 41684 1727204485.53804: Calling groups_inventory to load vars for managed-node1 41684 1727204485.53806: Calling all_plugins_inventory to load vars for managed-node1 41684 1727204485.53821: Calling all_plugins_play to load vars for managed-node1 41684 1727204485.53824: Calling groups_plugins_inventory to load vars for managed-node1 41684 1727204485.53827: Calling groups_plugins_play to load vars for managed-node1 41684 1727204485.54783: done sending task result for task 0affcd87-79f5-3839-086d-000000000b44 41684 1727204485.54787: WORKER PROCESS EXITING 41684 1727204485.55567: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41684 1727204485.57410: done with get_vars() 41684 1727204485.57433: done getting variables 41684 1727204485.57496: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 41684 1727204485.57608: variable 'profile' from source: include params 41684 1727204485.57612: variable 'item' from source: include params 41684 1727204485.57673: variable 'item' from source: include params TASK [Verify the fingerprint comment in ifcfg-ethtest0] ************************ task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:69 Tuesday 24 September 2024 15:01:25 -0400 (0:00:00.057) 0:00:41.978 ***** 41684 1727204485.57706: entering _queue_task() for managed-node1/set_fact 41684 1727204485.58013: worker is 1 (out of 1 available) 41684 1727204485.58025: exiting _queue_task() for managed-node1/set_fact 41684 
1727204485.58039: done queuing things up, now waiting for results queue to drain 41684 1727204485.58041: waiting for pending results... 41684 1727204485.58331: running TaskExecutor() for managed-node1/TASK: Verify the fingerprint comment in ifcfg-ethtest0 41684 1727204485.58459: in run() - task 0affcd87-79f5-3839-086d-000000000b45 41684 1727204485.58487: variable 'ansible_search_path' from source: unknown 41684 1727204485.58498: variable 'ansible_search_path' from source: unknown 41684 1727204485.58540: calling self._execute() 41684 1727204485.58651: variable 'ansible_host' from source: host vars for 'managed-node1' 41684 1727204485.58666: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41684 1727204485.58684: variable 'omit' from source: magic vars 41684 1727204485.59059: variable 'ansible_distribution_major_version' from source: facts 41684 1727204485.59167: Evaluated conditional (ansible_distribution_major_version != '6'): True 41684 1727204485.59302: variable 'profile_stat' from source: set_fact 41684 1727204485.59318: Evaluated conditional (profile_stat.stat.exists): False 41684 1727204485.59326: when evaluation is False, skipping this task 41684 1727204485.59334: _execute() done 41684 1727204485.59341: dumping result to json 41684 1727204485.59348: done dumping result, returning 41684 1727204485.59357: done running TaskExecutor() for managed-node1/TASK: Verify the fingerprint comment in ifcfg-ethtest0 [0affcd87-79f5-3839-086d-000000000b45] 41684 1727204485.59377: sending task result for task 0affcd87-79f5-3839-086d-000000000b45 skipping: [managed-node1] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 41684 1727204485.59526: no more pending results, returning what we have 41684 1727204485.59530: results queue empty 41684 1727204485.59531: checking for any_errors_fatal 41684 1727204485.59538: done checking for any_errors_fatal 41684 1727204485.59539: checking for 
max_fail_percentage 41684 1727204485.59540: done checking for max_fail_percentage 41684 1727204485.59541: checking to see if all hosts have failed and the running result is not ok 41684 1727204485.59542: done checking to see if all hosts have failed 41684 1727204485.59543: getting the remaining hosts for this loop 41684 1727204485.59545: done getting the remaining hosts for this loop 41684 1727204485.59549: getting the next task for host managed-node1 41684 1727204485.59561: done getting next task for host managed-node1 41684 1727204485.59569: ^ task is: TASK: Assert that the profile is absent - '{{ profile }}' 41684 1727204485.59575: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 41684 1727204485.59581: getting variables 41684 1727204485.59582: in VariableManager get_vars() 41684 1727204485.59627: Calling all_inventory to load vars for managed-node1 41684 1727204485.59630: Calling groups_inventory to load vars for managed-node1 41684 1727204485.59632: Calling all_plugins_inventory to load vars for managed-node1 41684 1727204485.59646: Calling all_plugins_play to load vars for managed-node1 41684 1727204485.59649: Calling groups_plugins_inventory to load vars for managed-node1 41684 1727204485.59651: Calling groups_plugins_play to load vars for managed-node1 41684 1727204485.60971: done sending task result for task 0affcd87-79f5-3839-086d-000000000b45 41684 1727204485.60975: WORKER PROCESS EXITING 41684 1727204485.62199: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41684 1727204485.64769: done with get_vars() 41684 1727204485.64801: done getting variables 41684 1727204485.64872: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 41684 1727204485.65002: variable 'profile' from source: include params 41684 1727204485.65006: variable 'item' from source: include params 41684 1727204485.65070: variable 'item' from source: include params TASK [Assert that the profile is absent - 'ethtest0'] ************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_absent.yml:5 Tuesday 24 September 2024 15:01:25 -0400 (0:00:00.073) 0:00:42.052 ***** 41684 1727204485.65105: entering _queue_task() for managed-node1/assert 41684 1727204485.65448: worker is 1 (out of 1 available) 41684 1727204485.65459: exiting _queue_task() for managed-node1/assert 41684 
1727204485.65478: done queuing things up, now waiting for results queue to drain 41684 1727204485.65479: waiting for pending results... 41684 1727204485.65784: running TaskExecutor() for managed-node1/TASK: Assert that the profile is absent - 'ethtest0' 41684 1727204485.65918: in run() - task 0affcd87-79f5-3839-086d-000000000a6d 41684 1727204485.65988: variable 'ansible_search_path' from source: unknown 41684 1727204485.65998: variable 'ansible_search_path' from source: unknown 41684 1727204485.66080: calling self._execute() 41684 1727204485.66251: variable 'ansible_host' from source: host vars for 'managed-node1' 41684 1727204485.66287: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41684 1727204485.66303: variable 'omit' from source: magic vars 41684 1727204485.66702: variable 'ansible_distribution_major_version' from source: facts 41684 1727204485.66721: Evaluated conditional (ansible_distribution_major_version != '6'): True 41684 1727204485.66734: variable 'omit' from source: magic vars 41684 1727204485.66791: variable 'omit' from source: magic vars 41684 1727204485.66903: variable 'profile' from source: include params 41684 1727204485.66917: variable 'item' from source: include params 41684 1727204485.66988: variable 'item' from source: include params 41684 1727204485.67012: variable 'omit' from source: magic vars 41684 1727204485.67066: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 41684 1727204485.67108: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 41684 1727204485.67140: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 41684 1727204485.67161: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41684 1727204485.67182: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41684 1727204485.67217: variable 'inventory_hostname' from source: host vars for 'managed-node1' 41684 1727204485.67226: variable 'ansible_host' from source: host vars for 'managed-node1' 41684 1727204485.67235: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41684 1727204485.67346: Set connection var ansible_connection to ssh 41684 1727204485.67361: Set connection var ansible_pipelining to False 41684 1727204485.67376: Set connection var ansible_module_compression to ZIP_DEFLATED 41684 1727204485.67385: Set connection var ansible_timeout to 10 41684 1727204485.67396: Set connection var ansible_shell_executable to /bin/sh 41684 1727204485.67402: Set connection var ansible_shell_type to sh 41684 1727204485.67429: variable 'ansible_shell_executable' from source: unknown 41684 1727204485.67437: variable 'ansible_connection' from source: unknown 41684 1727204485.67443: variable 'ansible_module_compression' from source: unknown 41684 1727204485.67450: variable 'ansible_shell_type' from source: unknown 41684 1727204485.67461: variable 'ansible_shell_executable' from source: unknown 41684 1727204485.67473: variable 'ansible_host' from source: host vars for 'managed-node1' 41684 1727204485.67481: variable 'ansible_pipelining' from source: unknown 41684 1727204485.67489: variable 'ansible_timeout' from source: unknown 41684 1727204485.67495: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41684 1727204485.67643: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 41684 1727204485.67665: variable 'omit' from source: magic vars 41684 1727204485.67677: starting 
attempt loop 41684 1727204485.67684: running the handler 41684 1727204485.67810: variable 'lsr_net_profile_exists' from source: set_fact 41684 1727204485.67820: Evaluated conditional (not lsr_net_profile_exists): True 41684 1727204485.67829: handler run complete 41684 1727204485.67847: attempt loop complete, returning result 41684 1727204485.67854: _execute() done 41684 1727204485.67866: dumping result to json 41684 1727204485.67875: done dumping result, returning 41684 1727204485.67886: done running TaskExecutor() for managed-node1/TASK: Assert that the profile is absent - 'ethtest0' [0affcd87-79f5-3839-086d-000000000a6d] 41684 1727204485.67899: sending task result for task 0affcd87-79f5-3839-086d-000000000a6d 41684 1727204485.68005: done sending task result for task 0affcd87-79f5-3839-086d-000000000a6d ok: [managed-node1] => { "changed": false } MSG: All assertions passed 41684 1727204485.68060: no more pending results, returning what we have 41684 1727204485.68069: results queue empty 41684 1727204485.68070: checking for any_errors_fatal 41684 1727204485.68081: done checking for any_errors_fatal 41684 1727204485.68082: checking for max_fail_percentage 41684 1727204485.68084: done checking for max_fail_percentage 41684 1727204485.68084: checking to see if all hosts have failed and the running result is not ok 41684 1727204485.68086: done checking to see if all hosts have failed 41684 1727204485.68086: getting the remaining hosts for this loop 41684 1727204485.68088: done getting the remaining hosts for this loop 41684 1727204485.68092: getting the next task for host managed-node1 41684 1727204485.68103: done getting next task for host managed-node1 41684 1727204485.68106: ^ task is: TASK: Include the task 'get_profile_stat.yml' 41684 1727204485.68110: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(None), rescue child state? (None), always child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 41684 1727204485.68114: getting variables 41684 1727204485.68116: in VariableManager get_vars() 41684 1727204485.68160: Calling all_inventory to load vars for managed-node1 41684 1727204485.68168: Calling groups_inventory to load vars for managed-node1 41684 1727204485.68171: Calling all_plugins_inventory to load vars for managed-node1 41684 1727204485.68184: Calling all_plugins_play to load vars for managed-node1 41684 1727204485.68187: Calling groups_plugins_inventory to load vars for managed-node1 41684 1727204485.68190: Calling groups_plugins_play to load vars for managed-node1 41684 1727204485.69373: WORKER PROCESS EXITING 41684 1727204485.70425: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41684 1727204485.72139: done with get_vars() 41684 1727204485.72170: done getting variables TASK [Include the task 'get_profile_stat.yml'] ********************************* task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_absent.yml:3 Tuesday 24 September 2024 15:01:25 -0400 (0:00:00.071) 0:00:42.124 ***** 41684 1727204485.72270: entering _queue_task() for managed-node1/include_tasks 41684 1727204485.72613: worker is 1 (out of 1 available) 41684 1727204485.72625: exiting 
_queue_task() for managed-node1/include_tasks 41684 1727204485.72639: done queuing things up, now waiting for results queue to drain 41684 1727204485.72641: waiting for pending results... 41684 1727204485.72937: running TaskExecutor() for managed-node1/TASK: Include the task 'get_profile_stat.yml' 41684 1727204485.73081: in run() - task 0affcd87-79f5-3839-086d-000000000a71 41684 1727204485.73106: variable 'ansible_search_path' from source: unknown 41684 1727204485.73113: variable 'ansible_search_path' from source: unknown 41684 1727204485.73155: calling self._execute() 41684 1727204485.73275: variable 'ansible_host' from source: host vars for 'managed-node1' 41684 1727204485.73286: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41684 1727204485.73301: variable 'omit' from source: magic vars 41684 1727204485.73682: variable 'ansible_distribution_major_version' from source: facts 41684 1727204485.73699: Evaluated conditional (ansible_distribution_major_version != '6'): True 41684 1727204485.73709: _execute() done 41684 1727204485.73717: dumping result to json 41684 1727204485.73723: done dumping result, returning 41684 1727204485.73732: done running TaskExecutor() for managed-node1/TASK: Include the task 'get_profile_stat.yml' [0affcd87-79f5-3839-086d-000000000a71] 41684 1727204485.73746: sending task result for task 0affcd87-79f5-3839-086d-000000000a71 41684 1727204485.73880: no more pending results, returning what we have 41684 1727204485.73886: in VariableManager get_vars() 41684 1727204485.73938: Calling all_inventory to load vars for managed-node1 41684 1727204485.73941: Calling groups_inventory to load vars for managed-node1 41684 1727204485.73943: Calling all_plugins_inventory to load vars for managed-node1 41684 1727204485.73958: Calling all_plugins_play to load vars for managed-node1 41684 1727204485.73961: Calling groups_plugins_inventory to load vars for managed-node1 41684 1727204485.73968: Calling groups_plugins_play to load 
vars for managed-node1 41684 1727204485.74983: done sending task result for task 0affcd87-79f5-3839-086d-000000000a71 41684 1727204485.74987: WORKER PROCESS EXITING 41684 1727204485.75730: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41684 1727204485.77497: done with get_vars() 41684 1727204485.77525: variable 'ansible_search_path' from source: unknown 41684 1727204485.77526: variable 'ansible_search_path' from source: unknown 41684 1727204485.77571: we have included files to process 41684 1727204485.77572: generating all_blocks data 41684 1727204485.77575: done generating all_blocks data 41684 1727204485.77580: processing included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 41684 1727204485.77581: loading included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 41684 1727204485.77584: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 41684 1727204485.79605: done processing included file 41684 1727204485.79608: iterating over new_blocks loaded from include file 41684 1727204485.79609: in VariableManager get_vars() 41684 1727204485.79633: done with get_vars() 41684 1727204485.79634: filtering new block on tags 41684 1727204485.79707: done filtering new block on tags 41684 1727204485.79711: in VariableManager get_vars() 41684 1727204485.79731: done with get_vars() 41684 1727204485.79733: filtering new block on tags 41684 1727204485.79790: done filtering new block on tags 41684 1727204485.79792: done iterating over new_blocks loaded from include file included: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml for managed-node1 41684 1727204485.79798: extending task lists for all hosts with included 
blocks 41684 1727204485.79927: done extending task lists 41684 1727204485.79929: done processing included files 41684 1727204485.79930: results queue empty 41684 1727204485.79930: checking for any_errors_fatal 41684 1727204485.79934: done checking for any_errors_fatal 41684 1727204485.79935: checking for max_fail_percentage 41684 1727204485.79936: done checking for max_fail_percentage 41684 1727204485.79937: checking to see if all hosts have failed and the running result is not ok 41684 1727204485.79938: done checking to see if all hosts have failed 41684 1727204485.79938: getting the remaining hosts for this loop 41684 1727204485.79940: done getting the remaining hosts for this loop 41684 1727204485.79942: getting the next task for host managed-node1 41684 1727204485.79946: done getting next task for host managed-node1 41684 1727204485.79949: ^ task is: TASK: Initialize NM profile exist and ansible_managed comment flag 41684 1727204485.79952: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 41684 1727204485.79954: getting variables 41684 1727204485.79955: in VariableManager get_vars() 41684 1727204485.79972: Calling all_inventory to load vars for managed-node1 41684 1727204485.79974: Calling groups_inventory to load vars for managed-node1 41684 1727204485.79976: Calling all_plugins_inventory to load vars for managed-node1 41684 1727204485.79982: Calling all_plugins_play to load vars for managed-node1 41684 1727204485.79984: Calling groups_plugins_inventory to load vars for managed-node1 41684 1727204485.79987: Calling groups_plugins_play to load vars for managed-node1 41684 1727204485.83528: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41684 1727204485.86793: done with get_vars() 41684 1727204485.86831: done getting variables 41684 1727204485.86890: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Initialize NM profile exist and ansible_managed comment flag] ************ task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:3 Tuesday 24 September 2024 15:01:25 -0400 (0:00:00.146) 0:00:42.270 ***** 41684 1727204485.86925: entering _queue_task() for managed-node1/set_fact 41684 1727204485.87413: worker is 1 (out of 1 available) 41684 1727204485.87426: exiting _queue_task() for managed-node1/set_fact 41684 1727204485.87439: done queuing things up, now waiting for results queue to drain 41684 1727204485.87441: waiting for pending results... 
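The set_fact task being queued at this point ("Initialize NM profile exist and ansible_managed comment flag", get_profile_stat.yml:3) can be inferred from its result further down in the log, which sets three `lsr_net_profile_*` facts to `false`. A plausible reconstruction of that task is sketched below; this is inferred from the task name and the `ansible_facts` in its result, not the verbatim source of get_profile_stat.yml:

```yaml
# Sketch (reconstructed, not verbatim): initialize the flags that later
# tasks in get_profile_stat.yml flip to true if the profile file exists,
# is marked ansible_managed, or carries a fingerprint comment.
- name: Initialize NM profile exist and ansible_managed comment flag
  set_fact:
    lsr_net_profile_exists: false
    lsr_net_profile_ansible_managed: false
    lsr_net_profile_fingerprint: false
```

Because these flags default to `false`, the earlier "Assert that the profile is absent" task (conditioned on `not lsr_net_profile_exists`) passes whenever the subsequent stat task finds no profile file, which matches the skip/ok pattern visible throughout this log.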
41684 1727204485.88340: running TaskExecutor() for managed-node1/TASK: Initialize NM profile exist and ansible_managed comment flag 41684 1727204485.88641: in run() - task 0affcd87-79f5-3839-086d-000000000b79 41684 1727204485.89097: variable 'ansible_search_path' from source: unknown 41684 1727204485.89102: variable 'ansible_search_path' from source: unknown 41684 1727204485.89139: calling self._execute() 41684 1727204485.89368: variable 'ansible_host' from source: host vars for 'managed-node1' 41684 1727204485.89372: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41684 1727204485.89375: variable 'omit' from source: magic vars 41684 1727204485.90206: variable 'ansible_distribution_major_version' from source: facts 41684 1727204485.90222: Evaluated conditional (ansible_distribution_major_version != '6'): True 41684 1727204485.90228: variable 'omit' from source: magic vars 41684 1727204485.90408: variable 'omit' from source: magic vars 41684 1727204485.90442: variable 'omit' from source: magic vars 41684 1727204485.90601: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 41684 1727204485.90638: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 41684 1727204485.90667: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 41684 1727204485.90683: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41684 1727204485.90695: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41684 1727204485.90839: variable 'inventory_hostname' from source: host vars for 'managed-node1' 41684 1727204485.90844: variable 'ansible_host' from source: host vars for 'managed-node1' 41684 1727204485.90846: variable 'ansible_ssh_extra_args' from source: host vars for 
'managed-node1' 41684 1727204485.91068: Set connection var ansible_connection to ssh 41684 1727204485.91072: Set connection var ansible_pipelining to False 41684 1727204485.91079: Set connection var ansible_module_compression to ZIP_DEFLATED 41684 1727204485.91084: Set connection var ansible_timeout to 10 41684 1727204485.91092: Set connection var ansible_shell_executable to /bin/sh 41684 1727204485.91095: Set connection var ansible_shell_type to sh 41684 1727204485.91121: variable 'ansible_shell_executable' from source: unknown 41684 1727204485.91124: variable 'ansible_connection' from source: unknown 41684 1727204485.91127: variable 'ansible_module_compression' from source: unknown 41684 1727204485.91130: variable 'ansible_shell_type' from source: unknown 41684 1727204485.91132: variable 'ansible_shell_executable' from source: unknown 41684 1727204485.91134: variable 'ansible_host' from source: host vars for 'managed-node1' 41684 1727204485.91256: variable 'ansible_pipelining' from source: unknown 41684 1727204485.91259: variable 'ansible_timeout' from source: unknown 41684 1727204485.91265: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41684 1727204485.91514: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 41684 1727204485.91524: variable 'omit' from source: magic vars 41684 1727204485.91531: starting attempt loop 41684 1727204485.91534: running the handler 41684 1727204485.91546: handler run complete 41684 1727204485.91558: attempt loop complete, returning result 41684 1727204485.91565: _execute() done 41684 1727204485.91568: dumping result to json 41684 1727204485.91600: done dumping result, returning 41684 1727204485.91608: done running TaskExecutor() for 
managed-node1/TASK: Initialize NM profile exist and ansible_managed comment flag [0affcd87-79f5-3839-086d-000000000b79] 41684 1727204485.91615: sending task result for task 0affcd87-79f5-3839-086d-000000000b79 41684 1727204485.91716: done sending task result for task 0affcd87-79f5-3839-086d-000000000b79 41684 1727204485.91720: WORKER PROCESS EXITING ok: [managed-node1] => { "ansible_facts": { "lsr_net_profile_ansible_managed": false, "lsr_net_profile_exists": false, "lsr_net_profile_fingerprint": false }, "changed": false } 41684 1727204485.91783: no more pending results, returning what we have 41684 1727204485.91788: results queue empty 41684 1727204485.91789: checking for any_errors_fatal 41684 1727204485.91791: done checking for any_errors_fatal 41684 1727204485.91792: checking for max_fail_percentage 41684 1727204485.91793: done checking for max_fail_percentage 41684 1727204485.91794: checking to see if all hosts have failed and the running result is not ok 41684 1727204485.91795: done checking to see if all hosts have failed 41684 1727204485.91796: getting the remaining hosts for this loop 41684 1727204485.91797: done getting the remaining hosts for this loop 41684 1727204485.91801: getting the next task for host managed-node1 41684 1727204485.91810: done getting next task for host managed-node1 41684 1727204485.91813: ^ task is: TASK: Stat profile file 41684 1727204485.91818: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 41684 1727204485.91822: getting variables 41684 1727204485.91823: in VariableManager get_vars() 41684 1727204485.91872: Calling all_inventory to load vars for managed-node1 41684 1727204485.91875: Calling groups_inventory to load vars for managed-node1 41684 1727204485.91877: Calling all_plugins_inventory to load vars for managed-node1 41684 1727204485.91888: Calling all_plugins_play to load vars for managed-node1 41684 1727204485.91891: Calling groups_plugins_inventory to load vars for managed-node1 41684 1727204485.91893: Calling groups_plugins_play to load vars for managed-node1 41684 1727204485.94023: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41684 1727204485.98119: done with get_vars() 41684 1727204485.98159: done getting variables TASK [Stat profile file] ******************************************************* task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:9 Tuesday 24 September 2024 15:01:25 -0400 (0:00:00.115) 0:00:42.386 ***** 41684 1727204485.98477: entering _queue_task() for managed-node1/stat 41684 1727204485.99339: worker is 1 (out of 1 available) 41684 1727204485.99352: exiting _queue_task() for 
managed-node1/stat 41684 1727204485.99370: done queuing things up, now waiting for results queue to drain 41684 1727204485.99371: waiting for pending results... 41684 1727204486.01074: running TaskExecutor() for managed-node1/TASK: Stat profile file 41684 1727204486.01184: in run() - task 0affcd87-79f5-3839-086d-000000000b7a 41684 1727204486.01198: variable 'ansible_search_path' from source: unknown 41684 1727204486.01202: variable 'ansible_search_path' from source: unknown 41684 1727204486.01242: calling self._execute() 41684 1727204486.01342: variable 'ansible_host' from source: host vars for 'managed-node1' 41684 1727204486.01346: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41684 1727204486.01358: variable 'omit' from source: magic vars 41684 1727204486.02828: variable 'ansible_distribution_major_version' from source: facts 41684 1727204486.02841: Evaluated conditional (ansible_distribution_major_version != '6'): True 41684 1727204486.02849: variable 'omit' from source: magic vars 41684 1727204486.02909: variable 'omit' from source: magic vars 41684 1727204486.03119: variable 'profile' from source: include params 41684 1727204486.03123: variable 'item' from source: include params 41684 1727204486.03193: variable 'item' from source: include params 41684 1727204486.03214: variable 'omit' from source: magic vars 41684 1727204486.03259: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 41684 1727204486.03297: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 41684 1727204486.03322: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 41684 1727204486.03336: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41684 1727204486.03349: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41684 1727204486.03382: variable 'inventory_hostname' from source: host vars for 'managed-node1' 41684 1727204486.03385: variable 'ansible_host' from source: host vars for 'managed-node1' 41684 1727204486.03389: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41684 1727204486.03491: Set connection var ansible_connection to ssh 41684 1727204486.03498: Set connection var ansible_pipelining to False 41684 1727204486.03504: Set connection var ansible_module_compression to ZIP_DEFLATED 41684 1727204486.03510: Set connection var ansible_timeout to 10 41684 1727204486.03518: Set connection var ansible_shell_executable to /bin/sh 41684 1727204486.03521: Set connection var ansible_shell_type to sh 41684 1727204486.03549: variable 'ansible_shell_executable' from source: unknown 41684 1727204486.03552: variable 'ansible_connection' from source: unknown 41684 1727204486.03555: variable 'ansible_module_compression' from source: unknown 41684 1727204486.03560: variable 'ansible_shell_type' from source: unknown 41684 1727204486.03566: variable 'ansible_shell_executable' from source: unknown 41684 1727204486.03568: variable 'ansible_host' from source: host vars for 'managed-node1' 41684 1727204486.03570: variable 'ansible_pipelining' from source: unknown 41684 1727204486.04377: variable 'ansible_timeout' from source: unknown 41684 1727204486.04380: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41684 1727204486.04592: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 41684 1727204486.04604: variable 'omit' from source: magic vars 41684 1727204486.04611: starting attempt loop 41684 1727204486.04614: running 
the handler 41684 1727204486.04630: _low_level_execute_command(): starting 41684 1727204486.04639: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 41684 1727204486.06072: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 41684 1727204486.06785: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41684 1727204486.06796: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41684 1727204486.06812: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41684 1727204486.06857: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 41684 1727204486.06868: stderr chunk (state=3): >>>debug2: match not found <<< 41684 1727204486.06876: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204486.06890: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 41684 1727204486.06898: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 41684 1727204486.06904: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 41684 1727204486.06912: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41684 1727204486.06922: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41684 1727204486.06934: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41684 1727204486.06942: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 41684 1727204486.06950: stderr chunk (state=3): >>>debug2: match found <<< 41684 1727204486.06959: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204486.07039: stderr chunk 
(state=3): >>>debug1: auto-mux: Trying existing master <<< 41684 1727204486.07055: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41684 1727204486.07058: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41684 1727204486.07150: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41684 1727204486.08798: stdout chunk (state=3): >>>/root <<< 41684 1727204486.08981: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41684 1727204486.08984: stdout chunk (state=3): >>><<< 41684 1727204486.08995: stderr chunk (state=3): >>><<< 41684 1727204486.09020: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41684 1727204486.09036: _low_level_execute_command(): starting 41684 1727204486.09043: _low_level_execute_command(): executing: /bin/sh -c '( 
umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204486.090216-44673-177157444467401 `" && echo ansible-tmp-1727204486.090216-44673-177157444467401="` echo /root/.ansible/tmp/ansible-tmp-1727204486.090216-44673-177157444467401 `" ) && sleep 0' 41684 1727204486.10100: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 41684 1727204486.10116: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41684 1727204486.10134: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41684 1727204486.10151: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41684 1727204486.10203: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 41684 1727204486.10215: stderr chunk (state=3): >>>debug2: match not found <<< 41684 1727204486.10233: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204486.10251: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 41684 1727204486.10266: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 41684 1727204486.10282: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 41684 1727204486.10294: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41684 1727204486.10307: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41684 1727204486.10322: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41684 1727204486.10334: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 41684 1727204486.10351: stderr chunk (state=3): >>>debug2: match found <<< 41684 1727204486.10373: stderr chunk (state=3): 
>>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204486.10459: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 41684 1727204486.10481: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41684 1727204486.10500: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41684 1727204486.10692: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41684 1727204486.12498: stdout chunk (state=3): >>>ansible-tmp-1727204486.090216-44673-177157444467401=/root/.ansible/tmp/ansible-tmp-1727204486.090216-44673-177157444467401 <<< 41684 1727204486.12702: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41684 1727204486.12705: stdout chunk (state=3): >>><<< 41684 1727204486.12708: stderr chunk (state=3): >>><<< 41684 1727204486.12973: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204486.090216-44673-177157444467401=/root/.ansible/tmp/ansible-tmp-1727204486.090216-44673-177157444467401 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41684 1727204486.12976: variable 'ansible_module_compression' from source: unknown 41684 1727204486.12979: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-41684fyviudxd/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 41684 1727204486.12981: variable 'ansible_facts' from source: unknown 41684 1727204486.12984: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204486.090216-44673-177157444467401/AnsiballZ_stat.py 41684 1727204486.13821: Sending initial data 41684 1727204486.13824: Sent initial data (152 bytes) 41684 1727204486.15090: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 41684 1727204486.15094: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41684 1727204486.15136: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204486.15139: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 41684 1727204486.15142: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204486.15212: stderr chunk (state=3): 
>>>debug1: auto-mux: Trying existing master <<< 41684 1727204486.15215: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41684 1727204486.15217: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41684 1727204486.15291: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41684 1727204486.16986: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 41684 1727204486.17033: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 41684 1727204486.17091: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-41684fyviudxd/tmpg3b4x8lg /root/.ansible/tmp/ansible-tmp-1727204486.090216-44673-177157444467401/AnsiballZ_stat.py <<< 41684 1727204486.17142: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 41684 1727204486.18568: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41684 1727204486.18827: stderr chunk (state=3): >>><<< 41684 1727204486.18830: stdout chunk (state=3): >>><<< 41684 1727204486.18833: done transferring module to remote 41684 1727204486.18835: _low_level_execute_command(): starting 41684 1727204486.18838: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204486.090216-44673-177157444467401/ 
/root/.ansible/tmp/ansible-tmp-1727204486.090216-44673-177157444467401/AnsiballZ_stat.py && sleep 0' 41684 1727204486.19917: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 41684 1727204486.19937: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41684 1727204486.19973: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41684 1727204486.19998: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41684 1727204486.20043: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 41684 1727204486.20071: stderr chunk (state=3): >>>debug2: match not found <<< 41684 1727204486.20088: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204486.20106: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 41684 1727204486.20119: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 41684 1727204486.20131: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 41684 1727204486.20150: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41684 1727204486.20182: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41684 1727204486.20200: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41684 1727204486.20219: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 41684 1727204486.20237: stderr chunk (state=3): >>>debug2: match found <<< 41684 1727204486.20252: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204486.20344: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 41684 
1727204486.20361: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41684 1727204486.20385: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41684 1727204486.20477: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41684 1727204486.22241: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41684 1727204486.22244: stdout chunk (state=3): >>><<< 41684 1727204486.22266: stderr chunk (state=3): >>><<< 41684 1727204486.22270: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41684 1727204486.22275: _low_level_execute_command(): starting 41684 1727204486.22281: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204486.090216-44673-177157444467401/AnsiballZ_stat.py && sleep 0' 41684 
1727204486.23315: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 41684 1727204486.23322: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41684 1727204486.23369: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204486.23375: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 41684 1727204486.23388: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41684 1727204486.23395: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204486.23476: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 41684 1727204486.23490: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41684 1727204486.23499: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41684 1727204486.23590: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41684 1727204486.36737: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-ethtest1", "follow": false, "checksum_algorithm": "sha1"}}} <<< 41684 1727204486.37772: stderr chunk (state=3): >>>debug2: 
Received exit status from master 0 Shared connection to 10.31.9.148 closed. <<< 41684 1727204486.37777: stdout chunk (state=3): >>><<< 41684 1727204486.37779: stderr chunk (state=3): >>><<< 41684 1727204486.37928: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-ethtest1", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.148 closed. 
41684 1727204486.37933: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/etc/sysconfig/network-scripts/ifcfg-ethtest1', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204486.090216-44673-177157444467401/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 41684 1727204486.37940: _low_level_execute_command(): starting 41684 1727204486.37943: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204486.090216-44673-177157444467401/ > /dev/null 2>&1 && sleep 0' 41684 1727204486.38589: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 41684 1727204486.38609: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41684 1727204486.38625: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41684 1727204486.38645: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41684 1727204486.38692: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 41684 1727204486.38712: stderr chunk (state=3): >>>debug2: match not found <<< 41684 1727204486.38728: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204486.38747: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 41684 
1727204486.38759: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 41684 1727204486.38775: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 41684 1727204486.38788: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41684 1727204486.38803: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41684 1727204486.38825: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41684 1727204486.38837: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 41684 1727204486.38850: stderr chunk (state=3): >>>debug2: match found <<< 41684 1727204486.38867: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204486.38949: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 41684 1727204486.38974: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41684 1727204486.38992: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41684 1727204486.39089: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41684 1727204486.40888: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41684 1727204486.40972: stderr chunk (state=3): >>><<< 41684 1727204486.40988: stdout chunk (state=3): >>><<< 41684 1727204486.41176: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41684 1727204486.41180: handler run complete 41684 1727204486.41182: attempt loop complete, returning result 41684 1727204486.41184: _execute() done 41684 1727204486.41186: dumping result to json 41684 1727204486.41189: done dumping result, returning 41684 1727204486.41190: done running TaskExecutor() for managed-node1/TASK: Stat profile file [0affcd87-79f5-3839-086d-000000000b7a] 41684 1727204486.41192: sending task result for task 0affcd87-79f5-3839-086d-000000000b7a 41684 1727204486.41278: done sending task result for task 0affcd87-79f5-3839-086d-000000000b7a 41684 1727204486.41281: WORKER PROCESS EXITING ok: [managed-node1] => { "changed": false, "stat": { "exists": false } } 41684 1727204486.41343: no more pending results, returning what we have 41684 1727204486.41347: results queue empty 41684 1727204486.41348: checking for any_errors_fatal 41684 1727204486.41358: done checking for any_errors_fatal 41684 1727204486.41359: checking for max_fail_percentage 41684 1727204486.41361: done checking for max_fail_percentage 41684 1727204486.41366: checking to see if all hosts have failed and the running result is not ok 41684 1727204486.41367: done checking to see if all hosts have failed 41684 1727204486.41368: getting the remaining hosts for this loop 41684 
1727204486.41370: done getting the remaining hosts for this loop 41684 1727204486.41375: getting the next task for host managed-node1 41684 1727204486.41383: done getting next task for host managed-node1 41684 1727204486.41385: ^ task is: TASK: Set NM profile exist flag based on the profile files 41684 1727204486.41391: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 41684 1727204486.41396: getting variables 41684 1727204486.41398: in VariableManager get_vars() 41684 1727204486.41444: Calling all_inventory to load vars for managed-node1 41684 1727204486.41447: Calling groups_inventory to load vars for managed-node1 41684 1727204486.41450: Calling all_plugins_inventory to load vars for managed-node1 41684 1727204486.41467: Calling all_plugins_play to load vars for managed-node1 41684 1727204486.41470: Calling groups_plugins_inventory to load vars for managed-node1 41684 1727204486.41474: Calling groups_plugins_play to load vars for managed-node1 41684 1727204486.43789: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41684 1727204486.45492: done with get_vars() 41684 1727204486.45520: done getting variables 41684 1727204486.45585: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set NM profile exist flag based on the profile files] ******************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:17 Tuesday 24 September 2024 15:01:26 -0400 (0:00:00.471) 0:00:42.857 ***** 41684 1727204486.45620: entering _queue_task() for managed-node1/set_fact 41684 1727204486.45959: worker is 1 (out of 1 available) 41684 1727204486.45976: exiting _queue_task() for managed-node1/set_fact 41684 1727204486.45989: done queuing things up, now waiting for results queue to drain 41684 1727204486.45990: waiting for pending results... 
41684 1727204486.46275: running TaskExecutor() for managed-node1/TASK: Set NM profile exist flag based on the profile files 41684 1727204486.46394: in run() - task 0affcd87-79f5-3839-086d-000000000b7b 41684 1727204486.46407: variable 'ansible_search_path' from source: unknown 41684 1727204486.46411: variable 'ansible_search_path' from source: unknown 41684 1727204486.46451: calling self._execute() 41684 1727204486.46553: variable 'ansible_host' from source: host vars for 'managed-node1' 41684 1727204486.46556: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41684 1727204486.46570: variable 'omit' from source: magic vars 41684 1727204486.46921: variable 'ansible_distribution_major_version' from source: facts 41684 1727204486.46933: Evaluated conditional (ansible_distribution_major_version != '6'): True 41684 1727204486.47060: variable 'profile_stat' from source: set_fact 41684 1727204486.47073: Evaluated conditional (profile_stat.stat.exists): False 41684 1727204486.47080: when evaluation is False, skipping this task 41684 1727204486.47083: _execute() done 41684 1727204486.47085: dumping result to json 41684 1727204486.47088: done dumping result, returning 41684 1727204486.47095: done running TaskExecutor() for managed-node1/TASK: Set NM profile exist flag based on the profile files [0affcd87-79f5-3839-086d-000000000b7b] 41684 1727204486.47102: sending task result for task 0affcd87-79f5-3839-086d-000000000b7b 41684 1727204486.47204: done sending task result for task 0affcd87-79f5-3839-086d-000000000b7b 41684 1727204486.47208: WORKER PROCESS EXITING skipping: [managed-node1] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 41684 1727204486.47266: no more pending results, returning what we have 41684 1727204486.47272: results queue empty 41684 1727204486.47273: checking for any_errors_fatal 41684 1727204486.47286: done checking for any_errors_fatal 41684 1727204486.47287: 
checking for max_fail_percentage 41684 1727204486.47289: done checking for max_fail_percentage 41684 1727204486.47290: checking to see if all hosts have failed and the running result is not ok 41684 1727204486.47291: done checking to see if all hosts have failed 41684 1727204486.47292: getting the remaining hosts for this loop 41684 1727204486.47294: done getting the remaining hosts for this loop 41684 1727204486.47299: getting the next task for host managed-node1 41684 1727204486.47308: done getting next task for host managed-node1 41684 1727204486.47312: ^ task is: TASK: Get NM profile info 41684 1727204486.47318: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 41684 1727204486.47324: getting variables 41684 1727204486.47326: in VariableManager get_vars() 41684 1727204486.47377: Calling all_inventory to load vars for managed-node1 41684 1727204486.47381: Calling groups_inventory to load vars for managed-node1 41684 1727204486.47383: Calling all_plugins_inventory to load vars for managed-node1 41684 1727204486.47398: Calling all_plugins_play to load vars for managed-node1 41684 1727204486.47400: Calling groups_plugins_inventory to load vars for managed-node1 41684 1727204486.47404: Calling groups_plugins_play to load vars for managed-node1 41684 1727204486.49213: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41684 1727204486.50819: done with get_vars() 41684 1727204486.50843: done getting variables 41684 1727204486.50904: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Get NM profile info] ***************************************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:25 Tuesday 24 September 2024 15:01:26 -0400 (0:00:00.053) 0:00:42.910 ***** 41684 1727204486.50939: entering _queue_task() for managed-node1/shell 41684 1727204486.51254: worker is 1 (out of 1 available) 41684 1727204486.51269: exiting _queue_task() for managed-node1/shell 41684 1727204486.51282: done queuing things up, now waiting for results queue to drain 41684 1727204486.51284: waiting for pending results... 
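The "Get NM profile info" task queued here shells out to `nmcli` and treats the profile as present only when its backing connection file lives under /etc. A sketch of that pipeline against simulated `nmcli` output; the sample lines are invented (the real task runs `nmcli -f NAME,FILENAME connection show` on the managed node):

```shell
#!/bin/sh
# Simulated `nmcli -f NAME,FILENAME connection show` output (hypothetical
# sample; the real task queries NetworkManager on managed-node1).
nmcli_out="eth0      /run/NetworkManager/system-connections/eth0.nmconnection
ethtest1  /run/NetworkManager/system-connections/ethtest1.nmconnection"

# The task's pipeline: select the profile row, then require a /etc path.
# Here ethtest1 exists only under /run, so the second grep matches nothing
# and the pipeline exits non-zero -- the rc=1 the log records below.
printf '%s\n' "$nmcli_out" | grep ethtest1 | grep /etc
echo "rc=$?"
```

A non-zero exit from this pipeline is how the task signals "no persistent NM profile": the module result further down reports `"rc": 1` with `"msg": "non-zero return code"`, which the surrounding play interprets as the profile file being absent rather than as a task failure.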
41684 1727204486.51573: running TaskExecutor() for managed-node1/TASK: Get NM profile info 41684 1727204486.51686: in run() - task 0affcd87-79f5-3839-086d-000000000b7c 41684 1727204486.51697: variable 'ansible_search_path' from source: unknown 41684 1727204486.51701: variable 'ansible_search_path' from source: unknown 41684 1727204486.51737: calling self._execute() 41684 1727204486.51826: variable 'ansible_host' from source: host vars for 'managed-node1' 41684 1727204486.51829: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41684 1727204486.51846: variable 'omit' from source: magic vars 41684 1727204486.52202: variable 'ansible_distribution_major_version' from source: facts 41684 1727204486.52214: Evaluated conditional (ansible_distribution_major_version != '6'): True 41684 1727204486.52220: variable 'omit' from source: magic vars 41684 1727204486.52277: variable 'omit' from source: magic vars 41684 1727204486.52368: variable 'profile' from source: include params 41684 1727204486.52372: variable 'item' from source: include params 41684 1727204486.52435: variable 'item' from source: include params 41684 1727204486.52455: variable 'omit' from source: magic vars 41684 1727204486.52497: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 41684 1727204486.52533: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 41684 1727204486.52554: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 41684 1727204486.52573: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41684 1727204486.52584: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41684 1727204486.52619: variable 'inventory_hostname' from source: host vars for 'managed-node1' 41684 
1727204486.52623: variable 'ansible_host' from source: host vars for 'managed-node1' 41684 1727204486.52625: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41684 1727204486.52726: Set connection var ansible_connection to ssh 41684 1727204486.52732: Set connection var ansible_pipelining to False 41684 1727204486.52738: Set connection var ansible_module_compression to ZIP_DEFLATED 41684 1727204486.52744: Set connection var ansible_timeout to 10 41684 1727204486.52753: Set connection var ansible_shell_executable to /bin/sh 41684 1727204486.52756: Set connection var ansible_shell_type to sh 41684 1727204486.52782: variable 'ansible_shell_executable' from source: unknown 41684 1727204486.52787: variable 'ansible_connection' from source: unknown 41684 1727204486.52789: variable 'ansible_module_compression' from source: unknown 41684 1727204486.52792: variable 'ansible_shell_type' from source: unknown 41684 1727204486.52794: variable 'ansible_shell_executable' from source: unknown 41684 1727204486.52796: variable 'ansible_host' from source: host vars for 'managed-node1' 41684 1727204486.52798: variable 'ansible_pipelining' from source: unknown 41684 1727204486.52801: variable 'ansible_timeout' from source: unknown 41684 1727204486.52806: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41684 1727204486.52948: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 41684 1727204486.52959: variable 'omit' from source: magic vars 41684 1727204486.52967: starting attempt loop 41684 1727204486.52970: running the handler 41684 1727204486.52979: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 41684 1727204486.52998: _low_level_execute_command(): starting 41684 1727204486.53006: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 41684 1727204486.53757: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 41684 1727204486.53775: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41684 1727204486.53788: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41684 1727204486.53808: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41684 1727204486.53847: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 41684 1727204486.53854: stderr chunk (state=3): >>>debug2: match not found <<< 41684 1727204486.53868: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204486.53880: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 41684 1727204486.53890: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 41684 1727204486.53896: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 41684 1727204486.53906: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41684 1727204486.53920: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41684 1727204486.53933: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41684 1727204486.53941: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 41684 1727204486.53948: stderr chunk (state=3): >>>debug2: match found <<< 41684 
1727204486.53959: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204486.54036: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 41684 1727204486.54052: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41684 1727204486.54056: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41684 1727204486.54151: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41684 1727204486.55703: stdout chunk (state=3): >>>/root <<< 41684 1727204486.55810: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41684 1727204486.55913: stderr chunk (state=3): >>><<< 41684 1727204486.55935: stdout chunk (state=3): >>><<< 41684 1727204486.56073: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit 
status from master 0 41684 1727204486.56084: _low_level_execute_command(): starting 41684 1727204486.56088: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204486.5597415-44695-141790460642823 `" && echo ansible-tmp-1727204486.5597415-44695-141790460642823="` echo /root/.ansible/tmp/ansible-tmp-1727204486.5597415-44695-141790460642823 `" ) && sleep 0' 41684 1727204486.56790: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 41684 1727204486.56804: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41684 1727204486.56821: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41684 1727204486.56849: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41684 1727204486.56897: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 41684 1727204486.56910: stderr chunk (state=3): >>>debug2: match not found <<< 41684 1727204486.56925: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204486.56959: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 41684 1727204486.56979: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 41684 1727204486.56991: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 41684 1727204486.57004: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41684 1727204486.57018: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41684 1727204486.57035: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41684 1727204486.57050: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 
10.31.9.148 originally 10.31.9.148 <<< 41684 1727204486.57072: stderr chunk (state=3): >>>debug2: match found <<< 41684 1727204486.57087: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204486.57172: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 41684 1727204486.57199: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41684 1727204486.57216: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41684 1727204486.57313: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41684 1727204486.59152: stdout chunk (state=3): >>>ansible-tmp-1727204486.5597415-44695-141790460642823=/root/.ansible/tmp/ansible-tmp-1727204486.5597415-44695-141790460642823 <<< 41684 1727204486.59348: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41684 1727204486.59352: stdout chunk (state=3): >>><<< 41684 1727204486.59354: stderr chunk (state=3): >>><<< 41684 1727204486.59441: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204486.5597415-44695-141790460642823=/root/.ansible/tmp/ansible-tmp-1727204486.5597415-44695-141790460642823 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 
'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41684 1727204486.59444: variable 'ansible_module_compression' from source: unknown 41684 1727204486.59465: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-41684fyviudxd/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 41684 1727204486.59510: variable 'ansible_facts' from source: unknown 41684 1727204486.59570: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204486.5597415-44695-141790460642823/AnsiballZ_command.py 41684 1727204486.59686: Sending initial data 41684 1727204486.59689: Sent initial data (156 bytes) 41684 1727204486.60339: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 41684 1727204486.60346: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41684 1727204486.60385: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204486.60391: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 41684 1727204486.60401: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41684 1727204486.60407: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: 
checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found <<< 41684 1727204486.60415: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204486.60472: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 41684 1727204486.60486: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41684 1727204486.60551: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41684 1727204486.62220: stderr chunk (state=3): >>>debug2: Remote version: 3 <<< 41684 1727204486.62238: stderr chunk (state=3): >>>debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 <<< 41684 1727204486.62242: stderr chunk (state=3): >>>debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 41684 1727204486.62284: stderr chunk (state=3): >>>debug1: Using server download size 261120 <<< 41684 1727204486.62291: stderr chunk (state=3): >>>debug1: Using server upload size 261120 <<< 41684 1727204486.62297: stderr chunk (state=3): >>>debug1: Server handle limit 1019; using 64 <<< 41684 1727204486.62355: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-41684fyviudxd/tmpid1s11yi /root/.ansible/tmp/ansible-tmp-1727204486.5597415-44695-141790460642823/AnsiballZ_command.py <<< 41684 1727204486.62411: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 41684 1727204486.63300: stderr chunk (state=3): >>>debug2: 
Received exit status from master 0 <<< 41684 1727204486.63359: stderr chunk (state=3): >>><<< 41684 1727204486.63370: stdout chunk (state=3): >>><<< 41684 1727204486.63383: done transferring module to remote 41684 1727204486.63391: _low_level_execute_command(): starting 41684 1727204486.63398: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204486.5597415-44695-141790460642823/ /root/.ansible/tmp/ansible-tmp-1727204486.5597415-44695-141790460642823/AnsiballZ_command.py && sleep 0' 41684 1727204486.63828: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 41684 1727204486.63844: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41684 1727204486.63875: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found <<< 41684 1727204486.63898: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 41684 1727204486.63918: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41684 1727204486.63936: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 41684 1727204486.63947: stderr chunk (state=3): >>>debug2: match found <<< 41684 1727204486.63966: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204486.64071: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 41684 
1727204486.64088: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41684 1727204486.64102: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41684 1727204486.64195: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41684 1727204486.65890: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41684 1727204486.65971: stderr chunk (state=3): >>><<< 41684 1727204486.65978: stdout chunk (state=3): >>><<< 41684 1727204486.65997: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41684 1727204486.66000: _low_level_execute_command(): starting 41684 1727204486.66005: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204486.5597415-44695-141790460642823/AnsiballZ_command.py && sleep 0' 41684 
1727204486.66656: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 41684 1727204486.66668: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41684 1727204486.66681: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41684 1727204486.66703: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41684 1727204486.66741: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 41684 1727204486.66749: stderr chunk (state=3): >>>debug2: match not found <<< 41684 1727204486.66759: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204486.66777: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 41684 1727204486.66785: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 41684 1727204486.66791: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 41684 1727204486.66807: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41684 1727204486.66821: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41684 1727204486.66833: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41684 1727204486.66841: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 41684 1727204486.66849: stderr chunk (state=3): >>>debug2: match found <<< 41684 1727204486.66858: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204486.66942: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 41684 1727204486.66959: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41684 1727204486.66975: stderr 
chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41684 1727204486.67059: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41684 1727204486.81840: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "", "rc": 1, "cmd": "nmcli -f NAME,FILENAME connection show |grep ethtest1 | grep /etc", "start": "2024-09-24 15:01:26.799962", "end": "2024-09-24 15:01:26.817651", "delta": "0:00:00.017689", "failed": true, "msg": "non-zero return code", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep ethtest1 | grep /etc", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 41684 1727204486.82971: stderr chunk (state=3): >>>debug2: Received exit status from master 1 Shared connection to 10.31.9.148 closed. <<< 41684 1727204486.82975: stdout chunk (state=3): >>><<< 41684 1727204486.82981: stderr chunk (state=3): >>><<< 41684 1727204486.83003: _low_level_execute_command() done: rc=1, stdout= {"changed": true, "stdout": "", "stderr": "", "rc": 1, "cmd": "nmcli -f NAME,FILENAME connection show |grep ethtest1 | grep /etc", "start": "2024-09-24 15:01:26.799962", "end": "2024-09-24 15:01:26.817651", "delta": "0:00:00.017689", "failed": true, "msg": "non-zero return code", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep ethtest1 | grep /etc", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf 
debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 1 Shared connection to 10.31.9.148 closed. 41684 1727204486.83041: done with _execute_module (ansible.legacy.command, {'_raw_params': 'nmcli -f NAME,FILENAME connection show |grep ethtest1 | grep /etc', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204486.5597415-44695-141790460642823/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 41684 1727204486.83049: _low_level_execute_command(): starting 41684 1727204486.83055: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r 
/root/.ansible/tmp/ansible-tmp-1727204486.5597415-44695-141790460642823/ > /dev/null 2>&1 && sleep 0' 41684 1727204486.83668: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 41684 1727204486.83677: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41684 1727204486.83688: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41684 1727204486.83701: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41684 1727204486.83739: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 41684 1727204486.83745: stderr chunk (state=3): >>>debug2: match not found <<< 41684 1727204486.83756: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204486.83771: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 41684 1727204486.83779: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 41684 1727204486.83785: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 41684 1727204486.83795: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41684 1727204486.83802: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41684 1727204486.83814: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41684 1727204486.83821: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 41684 1727204486.83828: stderr chunk (state=3): >>>debug2: match found <<< 41684 1727204486.83837: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204486.83917: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 41684 
1727204486.83925: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41684 1727204486.83928: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41684 1727204486.84019: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41684 1727204486.85773: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41684 1727204486.85851: stderr chunk (state=3): >>><<< 41684 1727204486.85879: stdout chunk (state=3): >>><<< 41684 1727204486.85970: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41684 1727204486.85974: handler run complete 41684 1727204486.85978: Evaluated conditional (False): False 41684 1727204486.85980: attempt loop complete, returning result 41684 1727204486.85982: _execute() done 41684 1727204486.85984: dumping result to json 41684 
1727204486.85986: done dumping result, returning 41684 1727204486.86175: done running TaskExecutor() for managed-node1/TASK: Get NM profile info [0affcd87-79f5-3839-086d-000000000b7c] 41684 1727204486.86178: sending task result for task 0affcd87-79f5-3839-086d-000000000b7c 41684 1727204486.86253: done sending task result for task 0affcd87-79f5-3839-086d-000000000b7c 41684 1727204486.86256: WORKER PROCESS EXITING fatal: [managed-node1]: FAILED! => { "changed": false, "cmd": "nmcli -f NAME,FILENAME connection show |grep ethtest1 | grep /etc", "delta": "0:00:00.017689", "end": "2024-09-24 15:01:26.817651", "rc": 1, "start": "2024-09-24 15:01:26.799962" } MSG: non-zero return code ...ignoring 41684 1727204486.86354: no more pending results, returning what we have 41684 1727204486.86359: results queue empty 41684 1727204486.86360: checking for any_errors_fatal 41684 1727204486.86379: done checking for any_errors_fatal 41684 1727204486.86380: checking for max_fail_percentage 41684 1727204486.86382: done checking for max_fail_percentage 41684 1727204486.86384: checking to see if all hosts have failed and the running result is not ok 41684 1727204486.86384: done checking to see if all hosts have failed 41684 1727204486.86385: getting the remaining hosts for this loop 41684 1727204486.86387: done getting the remaining hosts for this loop 41684 1727204486.86391: getting the next task for host managed-node1 41684 1727204486.86400: done getting next task for host managed-node1 41684 1727204486.86404: ^ task is: TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output 41684 1727204486.86409: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? 
(HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 41684 1727204486.86413: getting variables 41684 1727204486.86415: in VariableManager get_vars() 41684 1727204486.86469: Calling all_inventory to load vars for managed-node1 41684 1727204486.86472: Calling groups_inventory to load vars for managed-node1 41684 1727204486.86476: Calling all_plugins_inventory to load vars for managed-node1 41684 1727204486.86492: Calling all_plugins_play to load vars for managed-node1 41684 1727204486.86495: Calling groups_plugins_inventory to load vars for managed-node1 41684 1727204486.86497: Calling groups_plugins_play to load vars for managed-node1 41684 1727204486.87784: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41684 1727204486.88706: done with get_vars() 41684 1727204486.88724: done getting variables 41684 1727204486.88771: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set NM profile exist flag and ansible_managed flag true based on the nmcli output] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:35 Tuesday 24 September 2024 15:01:26 -0400 (0:00:00.378) 0:00:43.289 ***** 41684 1727204486.88796: entering _queue_task() for managed-node1/set_fact 41684 1727204486.89157: worker is 1 (out of 1 available) 41684 1727204486.89172: exiting _queue_task() for managed-node1/set_fact 41684 1727204486.89184: done queuing things up, now waiting for results queue to drain 41684 1727204486.89185: waiting for pending results... 41684 1727204486.89986: running TaskExecutor() for managed-node1/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output 41684 1727204486.89992: in run() - task 0affcd87-79f5-3839-086d-000000000b7d 41684 1727204486.89995: variable 'ansible_search_path' from source: unknown 41684 1727204486.89997: variable 'ansible_search_path' from source: unknown 41684 1727204486.90000: calling self._execute() 41684 1727204486.90003: variable 'ansible_host' from source: host vars for 'managed-node1' 41684 1727204486.90005: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41684 1727204486.90007: variable 'omit' from source: magic vars 41684 1727204486.90377: variable 'ansible_distribution_major_version' from source: facts 41684 1727204486.90382: Evaluated conditional (ansible_distribution_major_version != '6'): True 41684 1727204486.90385: variable 'nm_profile_exists' from source: set_fact 41684 1727204486.90387: Evaluated conditional (nm_profile_exists.rc == 0): False 41684 1727204486.90388: when evaluation is False, skipping this task 41684 1727204486.90390: _execute() done 41684 1727204486.90392: dumping result to 
json 41684 1727204486.90394: done dumping result, returning 41684 1727204486.90396: done running TaskExecutor() for managed-node1/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output [0affcd87-79f5-3839-086d-000000000b7d] 41684 1727204486.90398: sending task result for task 0affcd87-79f5-3839-086d-000000000b7d 41684 1727204486.90460: done sending task result for task 0affcd87-79f5-3839-086d-000000000b7d 41684 1727204486.90467: WORKER PROCESS EXITING skipping: [managed-node1] => { "changed": false, "false_condition": "nm_profile_exists.rc == 0", "skip_reason": "Conditional result was False" } 41684 1727204486.90555: no more pending results, returning what we have 41684 1727204486.90560: results queue empty 41684 1727204486.90561: checking for any_errors_fatal 41684 1727204486.90570: done checking for any_errors_fatal 41684 1727204486.90571: checking for max_fail_percentage 41684 1727204486.90572: done checking for max_fail_percentage 41684 1727204486.90573: checking to see if all hosts have failed and the running result is not ok 41684 1727204486.90574: done checking to see if all hosts have failed 41684 1727204486.90575: getting the remaining hosts for this loop 41684 1727204486.90576: done getting the remaining hosts for this loop 41684 1727204486.90581: getting the next task for host managed-node1 41684 1727204486.90590: done getting next task for host managed-node1 41684 1727204486.90593: ^ task is: TASK: Get the ansible_managed comment in ifcfg-{{ profile }} 41684 1727204486.90600: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 41684 1727204486.90604: getting variables 41684 1727204486.90605: in VariableManager get_vars() 41684 1727204486.90641: Calling all_inventory to load vars for managed-node1 41684 1727204486.90644: Calling groups_inventory to load vars for managed-node1 41684 1727204486.90646: Calling all_plugins_inventory to load vars for managed-node1 41684 1727204486.90656: Calling all_plugins_play to load vars for managed-node1 41684 1727204486.90658: Calling groups_plugins_inventory to load vars for managed-node1 41684 1727204486.90661: Calling groups_plugins_play to load vars for managed-node1 41684 1727204486.96185: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41684 1727204486.97102: done with get_vars() 41684 1727204486.97127: done getting variables 41684 1727204486.97166: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 41684 1727204486.97243: variable 'profile' from source: include params 41684 1727204486.97245: variable 'item' from source: include params 
41684 1727204486.97290: variable 'item' from source: include params TASK [Get the ansible_managed comment in ifcfg-ethtest1] *********************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:49 Tuesday 24 September 2024 15:01:26 -0400 (0:00:00.085) 0:00:43.374 ***** 41684 1727204486.97311: entering _queue_task() for managed-node1/command 41684 1727204486.97557: worker is 1 (out of 1 available) 41684 1727204486.97573: exiting _queue_task() for managed-node1/command 41684 1727204486.97586: done queuing things up, now waiting for results queue to drain 41684 1727204486.97588: waiting for pending results... 41684 1727204486.97780: running TaskExecutor() for managed-node1/TASK: Get the ansible_managed comment in ifcfg-ethtest1 41684 1727204486.97876: in run() - task 0affcd87-79f5-3839-086d-000000000b7f 41684 1727204486.97886: variable 'ansible_search_path' from source: unknown 41684 1727204486.97892: variable 'ansible_search_path' from source: unknown 41684 1727204486.97923: calling self._execute() 41684 1727204486.98005: variable 'ansible_host' from source: host vars for 'managed-node1' 41684 1727204486.98010: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41684 1727204486.98018: variable 'omit' from source: magic vars 41684 1727204486.98301: variable 'ansible_distribution_major_version' from source: facts 41684 1727204486.98310: Evaluated conditional (ansible_distribution_major_version != '6'): True 41684 1727204486.98399: variable 'profile_stat' from source: set_fact 41684 1727204486.98409: Evaluated conditional (profile_stat.stat.exists): False 41684 1727204486.98412: when evaluation is False, skipping this task 41684 1727204486.98415: _execute() done 41684 1727204486.98418: dumping result to json 41684 1727204486.98420: done dumping result, returning 41684 1727204486.98425: done running TaskExecutor() for managed-node1/TASK: Get the ansible_managed 
comment in ifcfg-ethtest1 [0affcd87-79f5-3839-086d-000000000b7f] 41684 1727204486.98432: sending task result for task 0affcd87-79f5-3839-086d-000000000b7f 41684 1727204486.98518: done sending task result for task 0affcd87-79f5-3839-086d-000000000b7f 41684 1727204486.98521: WORKER PROCESS EXITING skipping: [managed-node1] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 41684 1727204486.98578: no more pending results, returning what we have 41684 1727204486.98582: results queue empty 41684 1727204486.98583: checking for any_errors_fatal 41684 1727204486.98589: done checking for any_errors_fatal 41684 1727204486.98590: checking for max_fail_percentage 41684 1727204486.98592: done checking for max_fail_percentage 41684 1727204486.98592: checking to see if all hosts have failed and the running result is not ok 41684 1727204486.98594: done checking to see if all hosts have failed 41684 1727204486.98594: getting the remaining hosts for this loop 41684 1727204486.98596: done getting the remaining hosts for this loop 41684 1727204486.98600: getting the next task for host managed-node1 41684 1727204486.98608: done getting next task for host managed-node1 41684 1727204486.98610: ^ task is: TASK: Verify the ansible_managed comment in ifcfg-{{ profile }} 41684 1727204486.98616: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 41684 1727204486.98622: getting variables 41684 1727204486.98623: in VariableManager get_vars() 41684 1727204486.98662: Calling all_inventory to load vars for managed-node1 41684 1727204486.98672: Calling groups_inventory to load vars for managed-node1 41684 1727204486.98674: Calling all_plugins_inventory to load vars for managed-node1 41684 1727204486.98685: Calling all_plugins_play to load vars for managed-node1 41684 1727204486.98687: Calling groups_plugins_inventory to load vars for managed-node1 41684 1727204486.98690: Calling groups_plugins_play to load vars for managed-node1 41684 1727204486.99505: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41684 1727204487.00456: done with get_vars() 41684 1727204487.00475: done getting variables 41684 1727204487.00520: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 41684 1727204487.00601: variable 'profile' from source: include params 41684 1727204487.00603: variable 'item' from source: include params 41684 1727204487.00645: variable 'item' from source: include params TASK [Verify the ansible_managed comment in ifcfg-ethtest1] ******************** task path: 
/tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:56 Tuesday 24 September 2024 15:01:27 -0400 (0:00:00.033) 0:00:43.408 ***** 41684 1727204487.00671: entering _queue_task() for managed-node1/set_fact 41684 1727204487.00897: worker is 1 (out of 1 available) 41684 1727204487.00910: exiting _queue_task() for managed-node1/set_fact 41684 1727204487.00925: done queuing things up, now waiting for results queue to drain 41684 1727204487.00927: waiting for pending results... 41684 1727204487.01109: running TaskExecutor() for managed-node1/TASK: Verify the ansible_managed comment in ifcfg-ethtest1 41684 1727204487.01210: in run() - task 0affcd87-79f5-3839-086d-000000000b80 41684 1727204487.01221: variable 'ansible_search_path' from source: unknown 41684 1727204487.01223: variable 'ansible_search_path' from source: unknown 41684 1727204487.01252: calling self._execute() 41684 1727204487.01335: variable 'ansible_host' from source: host vars for 'managed-node1' 41684 1727204487.01338: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41684 1727204487.01347: variable 'omit' from source: magic vars 41684 1727204487.01627: variable 'ansible_distribution_major_version' from source: facts 41684 1727204487.01637: Evaluated conditional (ansible_distribution_major_version != '6'): True 41684 1727204487.01725: variable 'profile_stat' from source: set_fact 41684 1727204487.01735: Evaluated conditional (profile_stat.stat.exists): False 41684 1727204487.01738: when evaluation is False, skipping this task 41684 1727204487.01740: _execute() done 41684 1727204487.01744: dumping result to json 41684 1727204487.01747: done dumping result, returning 41684 1727204487.01752: done running TaskExecutor() for managed-node1/TASK: Verify the ansible_managed comment in ifcfg-ethtest1 [0affcd87-79f5-3839-086d-000000000b80] 41684 1727204487.01758: sending task result for task 
0affcd87-79f5-3839-086d-000000000b80 41684 1727204487.01847: done sending task result for task 0affcd87-79f5-3839-086d-000000000b80 41684 1727204487.01850: WORKER PROCESS EXITING skipping: [managed-node1] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 41684 1727204487.01901: no more pending results, returning what we have 41684 1727204487.01905: results queue empty 41684 1727204487.01906: checking for any_errors_fatal 41684 1727204487.01913: done checking for any_errors_fatal 41684 1727204487.01914: checking for max_fail_percentage 41684 1727204487.01915: done checking for max_fail_percentage 41684 1727204487.01916: checking to see if all hosts have failed and the running result is not ok 41684 1727204487.01917: done checking to see if all hosts have failed 41684 1727204487.01918: getting the remaining hosts for this loop 41684 1727204487.01920: done getting the remaining hosts for this loop 41684 1727204487.01923: getting the next task for host managed-node1 41684 1727204487.01931: done getting next task for host managed-node1 41684 1727204487.01933: ^ task is: TASK: Get the fingerprint comment in ifcfg-{{ profile }} 41684 1727204487.01938: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 41684 1727204487.01942: getting variables 41684 1727204487.01943: in VariableManager get_vars() 41684 1727204487.01990: Calling all_inventory to load vars for managed-node1 41684 1727204487.01993: Calling groups_inventory to load vars for managed-node1 41684 1727204487.01995: Calling all_plugins_inventory to load vars for managed-node1 41684 1727204487.02005: Calling all_plugins_play to load vars for managed-node1 41684 1727204487.02007: Calling groups_plugins_inventory to load vars for managed-node1 41684 1727204487.02010: Calling groups_plugins_play to load vars for managed-node1 41684 1727204487.02921: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41684 1727204487.03849: done with get_vars() 41684 1727204487.03866: done getting variables 41684 1727204487.03908: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 41684 1727204487.03992: variable 'profile' from source: include params 41684 1727204487.03995: variable 'item' from source: include params 41684 1727204487.04037: variable 'item' from source: include params TASK [Get the fingerprint comment in ifcfg-ethtest1] *************************** task path: 
/tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:62 Tuesday 24 September 2024 15:01:27 -0400 (0:00:00.033) 0:00:43.442 ***** 41684 1727204487.04063: entering _queue_task() for managed-node1/command 41684 1727204487.04292: worker is 1 (out of 1 available) 41684 1727204487.04307: exiting _queue_task() for managed-node1/command 41684 1727204487.04321: done queuing things up, now waiting for results queue to drain 41684 1727204487.04322: waiting for pending results... 41684 1727204487.04507: running TaskExecutor() for managed-node1/TASK: Get the fingerprint comment in ifcfg-ethtest1 41684 1727204487.04607: in run() - task 0affcd87-79f5-3839-086d-000000000b81 41684 1727204487.04618: variable 'ansible_search_path' from source: unknown 41684 1727204487.04622: variable 'ansible_search_path' from source: unknown 41684 1727204487.04650: calling self._execute() 41684 1727204487.04728: variable 'ansible_host' from source: host vars for 'managed-node1' 41684 1727204487.04732: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41684 1727204487.04741: variable 'omit' from source: magic vars 41684 1727204487.05019: variable 'ansible_distribution_major_version' from source: facts 41684 1727204487.05030: Evaluated conditional (ansible_distribution_major_version != '6'): True 41684 1727204487.05115: variable 'profile_stat' from source: set_fact 41684 1727204487.05125: Evaluated conditional (profile_stat.stat.exists): False 41684 1727204487.05128: when evaluation is False, skipping this task 41684 1727204487.05131: _execute() done 41684 1727204487.05133: dumping result to json 41684 1727204487.05136: done dumping result, returning 41684 1727204487.05141: done running TaskExecutor() for managed-node1/TASK: Get the fingerprint comment in ifcfg-ethtest1 [0affcd87-79f5-3839-086d-000000000b81] 41684 1727204487.05147: sending task result for task 0affcd87-79f5-3839-086d-000000000b81 41684 
1727204487.05233: done sending task result for task 0affcd87-79f5-3839-086d-000000000b81 41684 1727204487.05236: WORKER PROCESS EXITING skipping: [managed-node1] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 41684 1727204487.05293: no more pending results, returning what we have 41684 1727204487.05297: results queue empty 41684 1727204487.05298: checking for any_errors_fatal 41684 1727204487.05305: done checking for any_errors_fatal 41684 1727204487.05306: checking for max_fail_percentage 41684 1727204487.05307: done checking for max_fail_percentage 41684 1727204487.05308: checking to see if all hosts have failed and the running result is not ok 41684 1727204487.05309: done checking to see if all hosts have failed 41684 1727204487.05310: getting the remaining hosts for this loop 41684 1727204487.05312: done getting the remaining hosts for this loop 41684 1727204487.05315: getting the next task for host managed-node1 41684 1727204487.05322: done getting next task for host managed-node1 41684 1727204487.05325: ^ task is: TASK: Verify the fingerprint comment in ifcfg-{{ profile }} 41684 1727204487.05330: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 41684 1727204487.05333: getting variables 41684 1727204487.05334: in VariableManager get_vars() 41684 1727204487.05383: Calling all_inventory to load vars for managed-node1 41684 1727204487.05386: Calling groups_inventory to load vars for managed-node1 41684 1727204487.05389: Calling all_plugins_inventory to load vars for managed-node1 41684 1727204487.05398: Calling all_plugins_play to load vars for managed-node1 41684 1727204487.05400: Calling groups_plugins_inventory to load vars for managed-node1 41684 1727204487.05403: Calling groups_plugins_play to load vars for managed-node1 41684 1727204487.06229: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41684 1727204487.07280: done with get_vars() 41684 1727204487.07298: done getting variables 41684 1727204487.07344: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 41684 1727204487.07434: variable 'profile' from source: include params 41684 1727204487.07438: variable 'item' from source: include params 41684 1727204487.07483: variable 'item' from source: include params TASK [Verify the fingerprint comment in ifcfg-ethtest1] ************************ task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:69 Tuesday 24 September 2024 15:01:27 -0400 (0:00:00.034) 0:00:43.476 ***** 41684 
1727204487.07507: entering _queue_task() for managed-node1/set_fact 41684 1727204487.07757: worker is 1 (out of 1 available) 41684 1727204487.07773: exiting _queue_task() for managed-node1/set_fact 41684 1727204487.07787: done queuing things up, now waiting for results queue to drain 41684 1727204487.07789: waiting for pending results... 41684 1727204487.07982: running TaskExecutor() for managed-node1/TASK: Verify the fingerprint comment in ifcfg-ethtest1 41684 1727204487.08064: in run() - task 0affcd87-79f5-3839-086d-000000000b82 41684 1727204487.08079: variable 'ansible_search_path' from source: unknown 41684 1727204487.08082: variable 'ansible_search_path' from source: unknown 41684 1727204487.08112: calling self._execute() 41684 1727204487.08196: variable 'ansible_host' from source: host vars for 'managed-node1' 41684 1727204487.08199: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41684 1727204487.08209: variable 'omit' from source: magic vars 41684 1727204487.08496: variable 'ansible_distribution_major_version' from source: facts 41684 1727204487.08508: Evaluated conditional (ansible_distribution_major_version != '6'): True 41684 1727204487.08601: variable 'profile_stat' from source: set_fact 41684 1727204487.08615: Evaluated conditional (profile_stat.stat.exists): False 41684 1727204487.08619: when evaluation is False, skipping this task 41684 1727204487.08622: _execute() done 41684 1727204487.08625: dumping result to json 41684 1727204487.08627: done dumping result, returning 41684 1727204487.08630: done running TaskExecutor() for managed-node1/TASK: Verify the fingerprint comment in ifcfg-ethtest1 [0affcd87-79f5-3839-086d-000000000b82] 41684 1727204487.08635: sending task result for task 0affcd87-79f5-3839-086d-000000000b82 41684 1727204487.08725: done sending task result for task 0affcd87-79f5-3839-086d-000000000b82 41684 1727204487.08728: WORKER PROCESS EXITING skipping: [managed-node1] => { "changed": false, 
"false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 41684 1727204487.08780: no more pending results, returning what we have 41684 1727204487.08784: results queue empty 41684 1727204487.08785: checking for any_errors_fatal 41684 1727204487.08792: done checking for any_errors_fatal 41684 1727204487.08792: checking for max_fail_percentage 41684 1727204487.08794: done checking for max_fail_percentage 41684 1727204487.08794: checking to see if all hosts have failed and the running result is not ok 41684 1727204487.08796: done checking to see if all hosts have failed 41684 1727204487.08796: getting the remaining hosts for this loop 41684 1727204487.08798: done getting the remaining hosts for this loop 41684 1727204487.08802: getting the next task for host managed-node1 41684 1727204487.08810: done getting next task for host managed-node1 41684 1727204487.08813: ^ task is: TASK: Assert that the profile is absent - '{{ profile }}' 41684 1727204487.08818: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 41684 1727204487.08823: getting variables 41684 1727204487.08824: in VariableManager get_vars() 41684 1727204487.08875: Calling all_inventory to load vars for managed-node1 41684 1727204487.08878: Calling groups_inventory to load vars for managed-node1 41684 1727204487.08880: Calling all_plugins_inventory to load vars for managed-node1 41684 1727204487.08890: Calling all_plugins_play to load vars for managed-node1 41684 1727204487.08892: Calling groups_plugins_inventory to load vars for managed-node1 41684 1727204487.08895: Calling groups_plugins_play to load vars for managed-node1 41684 1727204487.09723: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41684 1727204487.10657: done with get_vars() 41684 1727204487.10676: done getting variables 41684 1727204487.10720: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 41684 1727204487.10808: variable 'profile' from source: include params 41684 1727204487.10810: variable 'item' from source: include params 41684 1727204487.10851: variable 'item' from source: include params TASK [Assert that the profile is absent - 'ethtest1'] ************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_absent.yml:5 Tuesday 24 September 2024 15:01:27 -0400 (0:00:00.033) 0:00:43.510 ***** 41684 1727204487.10878: entering _queue_task() for managed-node1/assert 41684 1727204487.11105: worker is 1 (out of 1 available) 41684 1727204487.11117: exiting _queue_task() for managed-node1/assert 41684 1727204487.11130: done queuing things up, now waiting for results queue to drain 41684 1727204487.11131: waiting for pending results... 
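The assert task being queued here is not shown in the transcript itself. Based on the task name in the banner above and the conditional evaluated below (`not lsr_net_profile_exists`), a minimal sketch of what `assert_profile_absent.yml:5` likely contains — a reconstruction from the log, not the verified file contents:

```yaml
# Hypothetical reconstruction of the task referenced at
# tests/network/playbooks/tasks/assert_profile_absent.yml:5.
# Task name and conditional are taken from the log; the msg text is assumed.
- name: "Assert that the profile is absent - '{{ profile }}'"
  assert:
    that:
      - not lsr_net_profile_exists
    msg: "Profile '{{ profile }}' should not exist"
```

The log's `Evaluated conditional (not lsr_net_profile_exists): True` line corresponds to the single `that` entry passing, which produces the `MSG: All assertions passed` result seen further down.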
41684 1727204487.11318: running TaskExecutor() for managed-node1/TASK: Assert that the profile is absent - 'ethtest1' 41684 1727204487.11402: in run() - task 0affcd87-79f5-3839-086d-000000000a72 41684 1727204487.11413: variable 'ansible_search_path' from source: unknown 41684 1727204487.11416: variable 'ansible_search_path' from source: unknown 41684 1727204487.11447: calling self._execute() 41684 1727204487.11526: variable 'ansible_host' from source: host vars for 'managed-node1' 41684 1727204487.11530: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41684 1727204487.11537: variable 'omit' from source: magic vars 41684 1727204487.11817: variable 'ansible_distribution_major_version' from source: facts 41684 1727204487.11827: Evaluated conditional (ansible_distribution_major_version != '6'): True 41684 1727204487.11833: variable 'omit' from source: magic vars 41684 1727204487.11865: variable 'omit' from source: magic vars 41684 1727204487.11936: variable 'profile' from source: include params 41684 1727204487.11940: variable 'item' from source: include params 41684 1727204487.11994: variable 'item' from source: include params 41684 1727204487.12008: variable 'omit' from source: magic vars 41684 1727204487.12043: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 41684 1727204487.12074: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 41684 1727204487.12094: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 41684 1727204487.12109: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41684 1727204487.12118: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41684 1727204487.12141: variable 'inventory_hostname' from source: host vars for 
'managed-node1' 41684 1727204487.12144: variable 'ansible_host' from source: host vars for 'managed-node1' 41684 1727204487.12147: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41684 1727204487.12221: Set connection var ansible_connection to ssh 41684 1727204487.12224: Set connection var ansible_pipelining to False 41684 1727204487.12229: Set connection var ansible_module_compression to ZIP_DEFLATED 41684 1727204487.12236: Set connection var ansible_timeout to 10 41684 1727204487.12241: Set connection var ansible_shell_executable to /bin/sh 41684 1727204487.12244: Set connection var ansible_shell_type to sh 41684 1727204487.12262: variable 'ansible_shell_executable' from source: unknown 41684 1727204487.12268: variable 'ansible_connection' from source: unknown 41684 1727204487.12271: variable 'ansible_module_compression' from source: unknown 41684 1727204487.12274: variable 'ansible_shell_type' from source: unknown 41684 1727204487.12276: variable 'ansible_shell_executable' from source: unknown 41684 1727204487.12279: variable 'ansible_host' from source: host vars for 'managed-node1' 41684 1727204487.12283: variable 'ansible_pipelining' from source: unknown 41684 1727204487.12285: variable 'ansible_timeout' from source: unknown 41684 1727204487.12289: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41684 1727204487.12393: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 41684 1727204487.12402: variable 'omit' from source: magic vars 41684 1727204487.12407: starting attempt loop 41684 1727204487.12410: running the handler 41684 1727204487.12498: variable 'lsr_net_profile_exists' from source: set_fact 41684 1727204487.12501: Evaluated conditional (not 
lsr_net_profile_exists): True 41684 1727204487.12507: handler run complete 41684 1727204487.12518: attempt loop complete, returning result 41684 1727204487.12521: _execute() done 41684 1727204487.12524: dumping result to json 41684 1727204487.12526: done dumping result, returning 41684 1727204487.12535: done running TaskExecutor() for managed-node1/TASK: Assert that the profile is absent - 'ethtest1' [0affcd87-79f5-3839-086d-000000000a72] 41684 1727204487.12538: sending task result for task 0affcd87-79f5-3839-086d-000000000a72 41684 1727204487.12620: done sending task result for task 0affcd87-79f5-3839-086d-000000000a72 41684 1727204487.12623: WORKER PROCESS EXITING ok: [managed-node1] => { "changed": false } MSG: All assertions passed 41684 1727204487.12677: no more pending results, returning what we have 41684 1727204487.12681: results queue empty 41684 1727204487.12682: checking for any_errors_fatal 41684 1727204487.12688: done checking for any_errors_fatal 41684 1727204487.12689: checking for max_fail_percentage 41684 1727204487.12691: done checking for max_fail_percentage 41684 1727204487.12692: checking to see if all hosts have failed and the running result is not ok 41684 1727204487.12692: done checking to see if all hosts have failed 41684 1727204487.12693: getting the remaining hosts for this loop 41684 1727204487.12695: done getting the remaining hosts for this loop 41684 1727204487.12698: getting the next task for host managed-node1 41684 1727204487.12705: done getting next task for host managed-node1 41684 1727204487.12709: ^ task is: TASK: Verify network state restored to default 41684 1727204487.12711: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? 
(HOST STATE: block=0, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 41684 1727204487.12715: getting variables 41684 1727204487.12717: in VariableManager get_vars() 41684 1727204487.12762: Calling all_inventory to load vars for managed-node1 41684 1727204487.12765: Calling groups_inventory to load vars for managed-node1 41684 1727204487.12768: Calling all_plugins_inventory to load vars for managed-node1 41684 1727204487.12779: Calling all_plugins_play to load vars for managed-node1 41684 1727204487.12782: Calling groups_plugins_inventory to load vars for managed-node1 41684 1727204487.12784: Calling groups_plugins_play to load vars for managed-node1 41684 1727204487.13735: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41684 1727204487.14651: done with get_vars() 41684 1727204487.14668: done getting variables TASK [Verify network state restored to default] ******************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_route_device.yml:169 Tuesday 24 September 2024 15:01:27 -0400 (0:00:00.038) 0:00:43.548 ***** 41684 1727204487.14735: entering _queue_task() for managed-node1/include_tasks 41684 1727204487.14950: worker is 1 (out of 1 available) 41684 1727204487.14966: exiting _queue_task() for managed-node1/include_tasks 41684 1727204487.14980: done queuing things up, now waiting for results queue to drain 41684 1727204487.14981: waiting for pending results... 
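The `_queue_task() for managed-node1/include_tasks` entries above, together with the included file logged shortly after (`check_network_dns.yml`), suggest the shape of the task at `tests_route_device.yml:169` — a sketch inferred from the log, not the verified playbook source:

```yaml
# Hypothetical sketch of the include at
# tests/network/playbooks/tests_route_device.yml:169, inferred from the
# "we have included files to process" entries that follow in the log.
- name: Verify network state restored to default
  include_tasks: tasks/check_network_dns.yml
```

`include_tasks` is why the log then shows "generating all_blocks data" and "extending task lists for all hosts with included blocks": the included file's tasks are spliced into the host's task list at runtime rather than at parse time.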
41684 1727204487.15163: running TaskExecutor() for managed-node1/TASK: Verify network state restored to default 41684 1727204487.15245: in run() - task 0affcd87-79f5-3839-086d-0000000000bb 41684 1727204487.15259: variable 'ansible_search_path' from source: unknown 41684 1727204487.15290: calling self._execute() 41684 1727204487.15362: variable 'ansible_host' from source: host vars for 'managed-node1' 41684 1727204487.15374: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41684 1727204487.15378: variable 'omit' from source: magic vars 41684 1727204487.15646: variable 'ansible_distribution_major_version' from source: facts 41684 1727204487.15657: Evaluated conditional (ansible_distribution_major_version != '6'): True 41684 1727204487.15662: _execute() done 41684 1727204487.15670: dumping result to json 41684 1727204487.15673: done dumping result, returning 41684 1727204487.15678: done running TaskExecutor() for managed-node1/TASK: Verify network state restored to default [0affcd87-79f5-3839-086d-0000000000bb] 41684 1727204487.15689: sending task result for task 0affcd87-79f5-3839-086d-0000000000bb 41684 1727204487.15773: done sending task result for task 0affcd87-79f5-3839-086d-0000000000bb 41684 1727204487.15776: WORKER PROCESS EXITING 41684 1727204487.15810: no more pending results, returning what we have 41684 1727204487.15815: in VariableManager get_vars() 41684 1727204487.15860: Calling all_inventory to load vars for managed-node1 41684 1727204487.15862: Calling groups_inventory to load vars for managed-node1 41684 1727204487.15866: Calling all_plugins_inventory to load vars for managed-node1 41684 1727204487.15877: Calling all_plugins_play to load vars for managed-node1 41684 1727204487.15880: Calling groups_plugins_inventory to load vars for managed-node1 41684 1727204487.15883: Calling groups_plugins_play to load vars for managed-node1 41684 1727204487.16679: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41684 1727204487.17605: done with get_vars() 41684 1727204487.17619: variable 'ansible_search_path' from source: unknown 41684 1727204487.17629: we have included files to process 41684 1727204487.17630: generating all_blocks data 41684 1727204487.17631: done generating all_blocks data 41684 1727204487.17635: processing included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml 41684 1727204487.17635: loading included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml 41684 1727204487.17637: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml 41684 1727204487.17904: done processing included file 41684 1727204487.17905: iterating over new_blocks loaded from include file 41684 1727204487.17906: in VariableManager get_vars() 41684 1727204487.17919: done with get_vars() 41684 1727204487.17920: filtering new block on tags 41684 1727204487.17940: done filtering new block on tags 41684 1727204487.17941: done iterating over new_blocks loaded from include file included: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml for managed-node1 41684 1727204487.17945: extending task lists for all hosts with included blocks 41684 1727204487.19037: done extending task lists 41684 1727204487.19039: done processing included files 41684 1727204487.19039: results queue empty 41684 1727204487.19040: checking for any_errors_fatal 41684 1727204487.19042: done checking for any_errors_fatal 41684 1727204487.19042: checking for max_fail_percentage 41684 1727204487.19043: done checking for max_fail_percentage 41684 1727204487.19043: checking to see if all hosts have failed and the running 
result is not ok 41684 1727204487.19044: done checking to see if all hosts have failed 41684 1727204487.19044: getting the remaining hosts for this loop 41684 1727204487.19045: done getting the remaining hosts for this loop 41684 1727204487.19047: getting the next task for host managed-node1 41684 1727204487.19051: done getting next task for host managed-node1 41684 1727204487.19053: ^ task is: TASK: Check routes and DNS 41684 1727204487.19055: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 41684 1727204487.19057: getting variables 41684 1727204487.19057: in VariableManager get_vars() 41684 1727204487.19069: Calling all_inventory to load vars for managed-node1 41684 1727204487.19071: Calling groups_inventory to load vars for managed-node1 41684 1727204487.19072: Calling all_plugins_inventory to load vars for managed-node1 41684 1727204487.19076: Calling all_plugins_play to load vars for managed-node1 41684 1727204487.19078: Calling groups_plugins_inventory to load vars for managed-node1 41684 1727204487.19079: Calling groups_plugins_play to load vars for managed-node1 41684 1727204487.19771: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41684 1727204487.20683: done with get_vars() 41684 1727204487.20699: done getting variables 41684 1727204487.20729: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Check routes and DNS] **************************************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml:6 Tuesday 24 September 2024 15:01:27 -0400 (0:00:00.060) 0:00:43.609 ***** 41684 1727204487.20750: entering _queue_task() for managed-node1/shell 41684 1727204487.21003: worker is 1 (out of 1 available) 41684 1727204487.21015: exiting _queue_task() for managed-node1/shell 41684 1727204487.21029: done queuing things up, now waiting for results queue to drain 41684 1727204487.21031: waiting for pending results... 
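The "Check routes and DNS" task queued here loads the `shell` action module and then, in the entries that follow, runs the usual low-level sequence: `echo ~` to resolve the remote home, `mkdir` for a per-task temp directory, sftp transfer of `AnsiballZ_command.py`, and `chmod u+x`. A sketch of what `check_network_dns.yml:6` might contain — the commands are assumptions inferred from the task name, not the verified file contents:

```yaml
# Hypothetical sketch of the task at
# tests/network/playbooks/tasks/check_network_dns.yml:6. Only the task name
# and the use of the shell action module come from the log; the command
# body and changed_when are assumed for illustration.
- name: Check routes and DNS
  shell: |
    set -eu
    ip route
    ip -6 route
    cat /etc/resolv.conf
  changed_when: false
```

Whatever the exact commands, the `shell` action delegates to the `command` module internally, which is why the log shows `Loading ActionModule 'command'` immediately after `running the handler`.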
41684 1727204487.21230: running TaskExecutor() for managed-node1/TASK: Check routes and DNS 41684 1727204487.21368: in run() - task 0affcd87-79f5-3839-086d-000000000bb6 41684 1727204487.21375: variable 'ansible_search_path' from source: unknown 41684 1727204487.21378: variable 'ansible_search_path' from source: unknown 41684 1727204487.21409: calling self._execute() 41684 1727204487.21489: variable 'ansible_host' from source: host vars for 'managed-node1' 41684 1727204487.21494: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41684 1727204487.21501: variable 'omit' from source: magic vars 41684 1727204487.21787: variable 'ansible_distribution_major_version' from source: facts 41684 1727204487.21797: Evaluated conditional (ansible_distribution_major_version != '6'): True 41684 1727204487.21803: variable 'omit' from source: magic vars 41684 1727204487.21834: variable 'omit' from source: magic vars 41684 1727204487.21857: variable 'omit' from source: magic vars 41684 1727204487.21895: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 41684 1727204487.21922: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 41684 1727204487.21943: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 41684 1727204487.21957: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41684 1727204487.21970: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41684 1727204487.22000: variable 'inventory_hostname' from source: host vars for 'managed-node1' 41684 1727204487.22003: variable 'ansible_host' from source: host vars for 'managed-node1' 41684 1727204487.22006: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41684 1727204487.22075: 
Set connection var ansible_connection to ssh 41684 1727204487.22080: Set connection var ansible_pipelining to False 41684 1727204487.22085: Set connection var ansible_module_compression to ZIP_DEFLATED 41684 1727204487.22090: Set connection var ansible_timeout to 10 41684 1727204487.22097: Set connection var ansible_shell_executable to /bin/sh 41684 1727204487.22100: Set connection var ansible_shell_type to sh 41684 1727204487.22119: variable 'ansible_shell_executable' from source: unknown 41684 1727204487.22122: variable 'ansible_connection' from source: unknown 41684 1727204487.22125: variable 'ansible_module_compression' from source: unknown 41684 1727204487.22127: variable 'ansible_shell_type' from source: unknown 41684 1727204487.22130: variable 'ansible_shell_executable' from source: unknown 41684 1727204487.22132: variable 'ansible_host' from source: host vars for 'managed-node1' 41684 1727204487.22134: variable 'ansible_pipelining' from source: unknown 41684 1727204487.22137: variable 'ansible_timeout' from source: unknown 41684 1727204487.22141: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41684 1727204487.22244: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 41684 1727204487.22254: variable 'omit' from source: magic vars 41684 1727204487.22257: starting attempt loop 41684 1727204487.22259: running the handler 41684 1727204487.22272: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 41684 1727204487.22287: 
_low_level_execute_command(): starting 41684 1727204487.22294: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 41684 1727204487.22832: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 41684 1727204487.22844: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41684 1727204487.22882: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found <<< 41684 1727204487.22889: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration <<< 41684 1727204487.23017: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41684 1727204487.23021: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204487.23027: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 41684 1727204487.23030: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41684 1727204487.23090: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41684 1727204487.24683: stdout chunk (state=3): >>>/root <<< 41684 1727204487.24792: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41684 1727204487.24838: stderr chunk (state=3): >>><<< 41684 1727204487.24842: stdout chunk (state=3): >>><<< 41684 
1727204487.24866: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41684 1727204487.24881: _low_level_execute_command(): starting 41684 1727204487.24887: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204487.2486885-44723-53387745318192 `" && echo ansible-tmp-1727204487.2486885-44723-53387745318192="` echo /root/.ansible/tmp/ansible-tmp-1727204487.2486885-44723-53387745318192 `" ) && sleep 0' 41684 1727204487.25317: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 41684 1727204487.25329: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41684 
1727204487.25374: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204487.25378: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 41684 1727204487.25380: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41684 1727204487.25383: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204487.25436: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 41684 1727204487.25439: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 41684 1727204487.25505: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41684 1727204487.27345: stdout chunk (state=3): >>>ansible-tmp-1727204487.2486885-44723-53387745318192=/root/.ansible/tmp/ansible-tmp-1727204487.2486885-44723-53387745318192 <<< 41684 1727204487.27460: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41684 1727204487.27533: stderr chunk (state=3): >>><<< 41684 1727204487.27536: stdout chunk (state=3): >>><<< 41684 1727204487.27554: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204487.2486885-44723-53387745318192=/root/.ansible/tmp/ansible-tmp-1727204487.2486885-44723-53387745318192 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41684 1727204487.27591: variable 'ansible_module_compression' from source: unknown 41684 1727204487.27647: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-41684fyviudxd/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 41684 1727204487.27684: variable 'ansible_facts' from source: unknown 41684 1727204487.27773: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204487.2486885-44723-53387745318192/AnsiballZ_command.py 41684 1727204487.27912: Sending initial data 41684 1727204487.27916: Sent initial data (155 bytes) 41684 1727204487.28835: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 41684 1727204487.28844: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41684 1727204487.28853: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41684 1727204487.28869: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41684 1727204487.28904: 
stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 41684 1727204487.28911: stderr chunk (state=3): >>>debug2: match not found <<< 41684 1727204487.28920: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204487.28932: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 41684 1727204487.28940: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 41684 1727204487.28947: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 41684 1727204487.28954: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41684 1727204487.28966: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41684 1727204487.28984: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41684 1727204487.28993: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 41684 1727204487.29000: stderr chunk (state=3): >>>debug2: match found <<< 41684 1727204487.29007: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204487.29078: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 41684 1727204487.29092: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41684 1727204487.29121: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41684 1727204487.29180: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41684 1727204487.30887: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" 
revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 41684 1727204487.30941: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 41684 1727204487.30998: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-41684fyviudxd/tmphmkdtdn_ /root/.ansible/tmp/ansible-tmp-1727204487.2486885-44723-53387745318192/AnsiballZ_command.py <<< 41684 1727204487.31047: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 41684 1727204487.32296: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41684 1727204487.32378: stderr chunk (state=3): >>><<< 41684 1727204487.32381: stdout chunk (state=3): >>><<< 41684 1727204487.32401: done transferring module to remote 41684 1727204487.32412: _low_level_execute_command(): starting 41684 1727204487.32417: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204487.2486885-44723-53387745318192/ /root/.ansible/tmp/ansible-tmp-1727204487.2486885-44723-53387745318192/AnsiballZ_command.py && sleep 0' 41684 1727204487.33047: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 41684 1727204487.33055: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41684 1727204487.33069: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41684 1727204487.33080: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41684 1727204487.33119: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 
10.31.9.148 originally 10.31.9.148 <<< 41684 1727204487.33128: stderr chunk (state=3): >>>debug2: match not found <<< 41684 1727204487.33140: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204487.33152: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 41684 1727204487.33159: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 41684 1727204487.33167: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 41684 1727204487.33176: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41684 1727204487.33185: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41684 1727204487.33197: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41684 1727204487.33202: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 41684 1727204487.33209: stderr chunk (state=3): >>>debug2: match found <<< 41684 1727204487.33219: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204487.33289: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 41684 1727204487.33306: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41684 1727204487.33315: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41684 1727204487.33395: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41684 1727204487.35155: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41684 1727204487.35158: stdout chunk (state=3): >>><<< 41684 1727204487.35167: stderr chunk (state=3): >>><<< 41684 1727204487.35186: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41684 1727204487.35189: _low_level_execute_command(): starting 41684 1727204487.35192: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204487.2486885-44723-53387745318192/AnsiballZ_command.py && sleep 0' 41684 1727204487.35883: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 41684 1727204487.35892: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41684 1727204487.35903: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41684 1727204487.35916: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41684 1727204487.35956: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 41684 1727204487.35969: stderr chunk (state=3): >>>debug2: match not found <<< 41684 
1727204487.35972: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204487.35987: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 41684 1727204487.35994: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 41684 1727204487.36001: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 41684 1727204487.36009: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41684 1727204487.36019: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41684 1727204487.36032: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41684 1727204487.36039: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 41684 1727204487.36047: stderr chunk (state=3): >>>debug2: match found <<< 41684 1727204487.36057: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204487.36134: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 41684 1727204487.36148: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41684 1727204487.36159: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41684 1727204487.36253: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41684 1727204487.50195: stdout chunk (state=3): >>> {"changed": true, "stdout": "IP\n1: lo: mtu 65536 qdisc noqueue state UNKNOWN group default qlen 1000\n link/loopback 00:00:00:00:00:00 brd 00:00:00:00:00:00\n inet 127.0.0.1/8 scope host lo\n valid_lft forever preferred_lft forever\n inet6 ::1/128 scope host \n valid_lft forever preferred_lft forever\n2: eth0: mtu 9001 qdisc mq state UP group default qlen 1000\n link/ether 12:8f:92:e7:c1:ab brd 
ff:ff:ff:ff:ff:ff\n altname enX0\n inet 10.31.9.148/22 brd 10.31.11.255 scope global dynamic noprefixroute eth0\n valid_lft 2880sec preferred_lft 2880sec\n inet6 fe80::108f:92ff:fee7:c1ab/64 scope link \n valid_lft forever preferred_lft forever\n21: rpltstbr: mtu 1500 qdisc noqueue state DOWN group default qlen 1000\n link/ether 4a:d1:a2:43:cd:1d brd ff:ff:ff:ff:ff:ff\n inet 192.0.2.72/31 scope global noprefixroute rpltstbr\n valid_lft forever preferred_lft forever\nIP ROUTE\ndefault via 10.31.8.1 dev eth0 proto dhcp src 10.31.9.148 metric 100 \n10.31.8.0/22 dev eth0 proto kernel scope link src 10.31.9.148 metric 100 \n192.0.2.72/31 dev rpltstbr proto kernel scope link src 192.0.2.72 metric 425 linkdown \nIP -6 ROUTE\n::1 dev lo proto kernel metric 256 pref medium\nfe80::/64 dev eth0 proto kernel metric 256 pref medium\nRESOLV\n# Generated by NetworkManager\nsearch us-east-1.aws.redhat.com\nnameserver 10.29.169.13\nnameserver 10.29.170.12\nnameserver 10.2.32.1", "stderr": "", "rc": 0, "cmd": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "start": "2024-09-24 15:01:27.492853", "end": "2024-09-24 15:01:27.500959", "delta": "0:00:00.008106", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 41684 1727204487.51288: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.148 closed. 
<<< 41684 1727204487.51373: stderr chunk (state=3): >>><<< 41684 1727204487.51471: stdout chunk (state=3): >>><<< 41684 1727204487.51475: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "IP\n1: lo: mtu 65536 qdisc noqueue state UNKNOWN group default qlen 1000\n link/loopback 00:00:00:00:00:00 brd 00:00:00:00:00:00\n inet 127.0.0.1/8 scope host lo\n valid_lft forever preferred_lft forever\n inet6 ::1/128 scope host \n valid_lft forever preferred_lft forever\n2: eth0: mtu 9001 qdisc mq state UP group default qlen 1000\n link/ether 12:8f:92:e7:c1:ab brd ff:ff:ff:ff:ff:ff\n altname enX0\n inet 10.31.9.148/22 brd 10.31.11.255 scope global dynamic noprefixroute eth0\n valid_lft 2880sec preferred_lft 2880sec\n inet6 fe80::108f:92ff:fee7:c1ab/64 scope link \n valid_lft forever preferred_lft forever\n21: rpltstbr: mtu 1500 qdisc noqueue state DOWN group default qlen 1000\n link/ether 4a:d1:a2:43:cd:1d brd ff:ff:ff:ff:ff:ff\n inet 192.0.2.72/31 scope global noprefixroute rpltstbr\n valid_lft forever preferred_lft forever\nIP ROUTE\ndefault via 10.31.8.1 dev eth0 proto dhcp src 10.31.9.148 metric 100 \n10.31.8.0/22 dev eth0 proto kernel scope link src 10.31.9.148 metric 100 \n192.0.2.72/31 dev rpltstbr proto kernel scope link src 192.0.2.72 metric 425 linkdown \nIP -6 ROUTE\n::1 dev lo proto kernel metric 256 pref medium\nfe80::/64 dev eth0 proto kernel metric 256 pref medium\nRESOLV\n# Generated by NetworkManager\nsearch us-east-1.aws.redhat.com\nnameserver 10.29.169.13\nnameserver 10.29.170.12\nnameserver 10.2.32.1", "stderr": "", "rc": 0, "cmd": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "start": "2024-09-24 15:01:27.492853", "end": "2024-09-24 15:01:27.500959", "delta": "0:00:00.008106", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euo 
pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.148 closed. 
41684 1727204487.51478: done with _execute_module (ansible.legacy.command, {'_raw_params': 'set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204487.2486885-44723-53387745318192/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 41684 1727204487.51481: _low_level_execute_command(): starting 41684 1727204487.51483: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204487.2486885-44723-53387745318192/ > /dev/null 2>&1 && sleep 0' 41684 1727204487.52144: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 41684 1727204487.52159: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41684 1727204487.52179: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41684 1727204487.52197: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41684 1727204487.52239: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 41684 1727204487.52252: stderr chunk (state=3): >>>debug2: match not found <<< 41684 1727204487.52272: stderr chunk (state=3): >>>debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204487.52290: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 41684 1727204487.52301: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 41684 1727204487.52312: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 41684 1727204487.52324: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41684 1727204487.52336: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41684 1727204487.52351: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41684 1727204487.52367: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 41684 1727204487.52379: stderr chunk (state=3): >>>debug2: match found <<< 41684 1727204487.52392: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204487.52473: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 41684 1727204487.52496: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41684 1727204487.52511: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41684 1727204487.52597: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41684 1727204487.54366: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41684 1727204487.54443: stderr chunk (state=3): >>><<< 41684 1727204487.54478: stdout chunk (state=3): >>><<< 41684 1727204487.54676: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking 
match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41684 1727204487.54687: handler run complete 41684 1727204487.54690: Evaluated conditional (False): False 41684 1727204487.54692: attempt loop complete, returning result 41684 1727204487.54694: _execute() done 41684 1727204487.54697: dumping result to json 41684 1727204487.54699: done dumping result, returning 41684 1727204487.54701: done running TaskExecutor() for managed-node1/TASK: Check routes and DNS [0affcd87-79f5-3839-086d-000000000bb6] 41684 1727204487.54703: sending task result for task 0affcd87-79f5-3839-086d-000000000bb6 41684 1727204487.54795: done sending task result for task 0affcd87-79f5-3839-086d-000000000bb6 41684 1727204487.54798: WORKER PROCESS EXITING ok: [managed-node1] => { "changed": false, "cmd": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "delta": "0:00:00.008106", "end": "2024-09-24 15:01:27.500959", "rc": 0, "start": "2024-09-24 15:01:27.492853" } STDOUT: IP 1: lo: mtu 65536 
qdisc noqueue state UNKNOWN group default qlen 1000 link/loopback 00:00:00:00:00:00 brd 00:00:00:00:00:00 inet 127.0.0.1/8 scope host lo valid_lft forever preferred_lft forever inet6 ::1/128 scope host valid_lft forever preferred_lft forever 2: eth0: mtu 9001 qdisc mq state UP group default qlen 1000 link/ether 12:8f:92:e7:c1:ab brd ff:ff:ff:ff:ff:ff altname enX0 inet 10.31.9.148/22 brd 10.31.11.255 scope global dynamic noprefixroute eth0 valid_lft 2880sec preferred_lft 2880sec inet6 fe80::108f:92ff:fee7:c1ab/64 scope link valid_lft forever preferred_lft forever 21: rpltstbr: mtu 1500 qdisc noqueue state DOWN group default qlen 1000 link/ether 4a:d1:a2:43:cd:1d brd ff:ff:ff:ff:ff:ff inet 192.0.2.72/31 scope global noprefixroute rpltstbr valid_lft forever preferred_lft forever IP ROUTE default via 10.31.8.1 dev eth0 proto dhcp src 10.31.9.148 metric 100 10.31.8.0/22 dev eth0 proto kernel scope link src 10.31.9.148 metric 100 192.0.2.72/31 dev rpltstbr proto kernel scope link src 192.0.2.72 metric 425 linkdown IP -6 ROUTE ::1 dev lo proto kernel metric 256 pref medium fe80::/64 dev eth0 proto kernel metric 256 pref medium RESOLV # Generated by NetworkManager search us-east-1.aws.redhat.com nameserver 10.29.169.13 nameserver 10.29.170.12 nameserver 10.2.32.1 41684 1727204487.54878: no more pending results, returning what we have 41684 1727204487.54883: results queue empty 41684 1727204487.54884: checking for any_errors_fatal 41684 1727204487.54886: done checking for any_errors_fatal 41684 1727204487.54887: checking for max_fail_percentage 41684 1727204487.54888: done checking for max_fail_percentage 41684 1727204487.54889: checking to see if all hosts have failed and the running result is not ok 41684 1727204487.54890: done checking to see if all hosts have failed 41684 1727204487.54891: getting the remaining hosts for this loop 41684 1727204487.54893: done getting the remaining hosts for this loop 41684 1727204487.54897: getting the next task for host managed-node1 
41684 1727204487.54904: done getting next task for host managed-node1 41684 1727204487.54908: ^ task is: TASK: Verify DNS and network connectivity 41684 1727204487.54912: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 41684 1727204487.54916: getting variables 41684 1727204487.54918: in VariableManager get_vars() 41684 1727204487.54972: Calling all_inventory to load vars for managed-node1 41684 1727204487.54975: Calling groups_inventory to load vars for managed-node1 41684 1727204487.54978: Calling all_plugins_inventory to load vars for managed-node1 41684 1727204487.54990: Calling all_plugins_play to load vars for managed-node1 41684 1727204487.54992: Calling groups_plugins_inventory to load vars for managed-node1 41684 1727204487.54995: Calling groups_plugins_play to load vars for managed-node1 41684 1727204487.56325: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41684 1727204487.57275: done with get_vars() 41684 1727204487.57292: done getting variables 41684 1727204487.57358: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Verify DNS and network connectivity] ************************************* task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml:24 Tuesday 24 September 2024 15:01:27 -0400 (0:00:00.366) 0:00:43.975 ***** 41684 1727204487.57402: entering _queue_task() for managed-node1/shell 41684 1727204487.57779: worker is 1 (out of 1 available) 41684 1727204487.57798: exiting _queue_task() for managed-node1/shell 41684 1727204487.57811: done queuing things up, now waiting for results queue to drain 41684 1727204487.57812: waiting for pending results... 
41684 1727204487.58154: running TaskExecutor() for managed-node1/TASK: Verify DNS and network connectivity 41684 1727204487.58392: in run() - task 0affcd87-79f5-3839-086d-000000000bb7 41684 1727204487.58408: variable 'ansible_search_path' from source: unknown 41684 1727204487.58413: variable 'ansible_search_path' from source: unknown 41684 1727204487.58460: calling self._execute() 41684 1727204487.58545: variable 'ansible_host' from source: host vars for 'managed-node1' 41684 1727204487.58549: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41684 1727204487.58567: variable 'omit' from source: magic vars 41684 1727204487.58850: variable 'ansible_distribution_major_version' from source: facts 41684 1727204487.58860: Evaluated conditional (ansible_distribution_major_version != '6'): True 41684 1727204487.58957: variable 'ansible_facts' from source: unknown 41684 1727204487.59420: Evaluated conditional (ansible_facts["distribution"] == "CentOS"): True 41684 1727204487.59424: variable 'omit' from source: magic vars 41684 1727204487.59458: variable 'omit' from source: magic vars 41684 1727204487.59484: variable 'omit' from source: magic vars 41684 1727204487.59516: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 41684 1727204487.59546: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 41684 1727204487.59570: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 41684 1727204487.59583: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41684 1727204487.59592: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41684 1727204487.59614: variable 'inventory_hostname' from source: host vars for 'managed-node1' 41684 1727204487.59617: variable 
'ansible_host' from source: host vars for 'managed-node1' 41684 1727204487.59621: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41684 1727204487.59698: Set connection var ansible_connection to ssh 41684 1727204487.59703: Set connection var ansible_pipelining to False 41684 1727204487.59709: Set connection var ansible_module_compression to ZIP_DEFLATED 41684 1727204487.59714: Set connection var ansible_timeout to 10 41684 1727204487.59720: Set connection var ansible_shell_executable to /bin/sh 41684 1727204487.59723: Set connection var ansible_shell_type to sh 41684 1727204487.59741: variable 'ansible_shell_executable' from source: unknown 41684 1727204487.59744: variable 'ansible_connection' from source: unknown 41684 1727204487.59746: variable 'ansible_module_compression' from source: unknown 41684 1727204487.59749: variable 'ansible_shell_type' from source: unknown 41684 1727204487.59752: variable 'ansible_shell_executable' from source: unknown 41684 1727204487.59755: variable 'ansible_host' from source: host vars for 'managed-node1' 41684 1727204487.59757: variable 'ansible_pipelining' from source: unknown 41684 1727204487.59759: variable 'ansible_timeout' from source: unknown 41684 1727204487.59769: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41684 1727204487.59861: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 41684 1727204487.59877: variable 'omit' from source: magic vars 41684 1727204487.59880: starting attempt loop 41684 1727204487.59882: running the handler 41684 1727204487.59892: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 41684 1727204487.59908: _low_level_execute_command(): starting 41684 1727204487.59915: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 41684 1727204487.60569: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41684 1727204487.60574: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found <<< 41684 1727204487.60673: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204487.60731: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 41684 1727204487.60735: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41684 1727204487.60738: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41684 1727204487.60789: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41684 1727204487.62319: stdout chunk (state=3): >>>/root <<< 41684 1727204487.62450: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41684 1727204487.62518: 
stderr chunk (state=3): >>><<< 41684 1727204487.62527: stdout chunk (state=3): >>><<< 41684 1727204487.62571: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41684 1727204487.62592: _low_level_execute_command(): starting 41684 1727204487.62602: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204487.6257854-44738-153734908748039 `" && echo ansible-tmp-1727204487.6257854-44738-153734908748039="` echo /root/.ansible/tmp/ansible-tmp-1727204487.6257854-44738-153734908748039 `" ) && sleep 0' 41684 1727204487.63315: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 41684 1727204487.63332: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41684 1727204487.63357: stderr chunk (state=3): 
>>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41684 1727204487.63381: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41684 1727204487.63430: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 41684 1727204487.63446: stderr chunk (state=3): >>>debug2: match not found <<< 41684 1727204487.63461: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204487.63485: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 41684 1727204487.63498: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 41684 1727204487.63509: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 41684 1727204487.63522: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41684 1727204487.63546: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41684 1727204487.63569: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41684 1727204487.63584: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 41684 1727204487.63595: stderr chunk (state=3): >>>debug2: match found <<< 41684 1727204487.63610: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204487.63701: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 41684 1727204487.63723: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41684 1727204487.63739: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41684 1727204487.63826: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41684 1727204487.65652: stdout chunk (state=3): 
>>>ansible-tmp-1727204487.6257854-44738-153734908748039=/root/.ansible/tmp/ansible-tmp-1727204487.6257854-44738-153734908748039 <<< 41684 1727204487.65766: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41684 1727204487.65854: stderr chunk (state=3): >>><<< 41684 1727204487.65870: stdout chunk (state=3): >>><<< 41684 1727204487.65975: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204487.6257854-44738-153734908748039=/root/.ansible/tmp/ansible-tmp-1727204487.6257854-44738-153734908748039 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41684 1727204487.65979: variable 'ansible_module_compression' from source: unknown 41684 1727204487.66174: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-41684fyviudxd/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 41684 1727204487.66485: variable 'ansible_facts' from source: unknown 41684 
1727204487.66596: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204487.6257854-44738-153734908748039/AnsiballZ_command.py 41684 1727204487.67323: Sending initial data 41684 1727204487.67326: Sent initial data (156 bytes) 41684 1727204487.69311: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 41684 1727204487.69326: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41684 1727204487.69351: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41684 1727204487.69375: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41684 1727204487.69418: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 41684 1727204487.69429: stderr chunk (state=3): >>>debug2: match not found <<< 41684 1727204487.69452: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204487.69476: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 41684 1727204487.69489: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 41684 1727204487.69501: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 41684 1727204487.69514: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41684 1727204487.69529: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41684 1727204487.69546: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41684 1727204487.69571: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 41684 1727204487.69584: stderr chunk (state=3): >>>debug2: match found <<< 41684 1727204487.69598: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204487.69683: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 41684 1727204487.69710: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41684 1727204487.69726: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41684 1727204487.69818: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41684 1727204487.71511: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 41684 1727204487.71573: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 41684 1727204487.71621: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-41684fyviudxd/tmphk1z3zzo /root/.ansible/tmp/ansible-tmp-1727204487.6257854-44738-153734908748039/AnsiballZ_command.py <<< 41684 1727204487.71686: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 41684 1727204487.72980: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41684 1727204487.73170: stderr chunk (state=3): >>><<< 41684 1727204487.73174: stdout chunk (state=3): >>><<< 41684 1727204487.73361: done transferring module to remote 41684 1727204487.73368: _low_level_execute_command(): starting 41684 1727204487.73371: _low_level_execute_command(): executing: /bin/sh -c 
'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204487.6257854-44738-153734908748039/ /root/.ansible/tmp/ansible-tmp-1727204487.6257854-44738-153734908748039/AnsiballZ_command.py && sleep 0' 41684 1727204487.75934: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 41684 1727204487.76586: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41684 1727204487.76596: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41684 1727204487.76611: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41684 1727204487.76651: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 41684 1727204487.76659: stderr chunk (state=3): >>>debug2: match not found <<< 41684 1727204487.76673: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204487.76684: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 41684 1727204487.76692: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 41684 1727204487.76699: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 41684 1727204487.76706: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41684 1727204487.76717: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41684 1727204487.76728: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41684 1727204487.76736: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 41684 1727204487.76743: stderr chunk (state=3): >>>debug2: match found <<< 41684 1727204487.76752: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204487.76829: 
stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 41684 1727204487.76851: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41684 1727204487.76870: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41684 1727204487.76951: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41684 1727204487.78726: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41684 1727204487.78730: stdout chunk (state=3): >>><<< 41684 1727204487.78739: stderr chunk (state=3): >>><<< 41684 1727204487.78754: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41684 1727204487.78757: _low_level_execute_command(): starting 41684 1727204487.78768: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 
/root/.ansible/tmp/ansible-tmp-1727204487.6257854-44738-153734908748039/AnsiballZ_command.py && sleep 0' 41684 1727204487.80716: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 41684 1727204487.80785: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41684 1727204487.80802: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41684 1727204487.80827: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41684 1727204487.80875: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 41684 1727204487.80946: stderr chunk (state=3): >>>debug2: match not found <<< 41684 1727204487.80959: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204487.80981: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 41684 1727204487.80994: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 41684 1727204487.81005: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 41684 1727204487.81017: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41684 1727204487.81031: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41684 1727204487.81053: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41684 1727204487.81069: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 41684 1727204487.81080: stderr chunk (state=3): >>>debug2: match found <<< 41684 1727204487.81093: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204487.81282: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 41684 
1727204487.81306: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41684 1727204487.81324: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41684 1727204487.81415: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41684 1727204488.21461: stdout chunk (state=3): >>> {"changed": true, "stdout": "CHECK DNS AND CONNECTIVITY\n2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed7 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2600:2701:4000:5211:dead:beef:fe:fed3 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed6 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2600:2701:4000:5211:dead:beef:fe:fed3 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed7 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed6 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org", "stderr": " % Total % Received % Xferd Average Speed Time Time Time Current\n Dload Upload Total Spent Left Speed\n\r 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0\r 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0\r100 305 100 305 0 0 1343 0 --:--:-- --:--:-- --:--:-- 1337\n % Total % Received % Xferd Average Speed Time Time Time Current\n Dload Upload Total Spent Left Speed\n\r 0 0 0 0 0 0 0 0 
--:--:-- --:--:-- --:--:-- 0\r100 291 100 291 0 0 13857 0 --:--:-- --:--:-- --:--:-- 13857", "rc": 0, "cmd": "set -euo pipefail\necho CHECK DNS AND CONNECTIVITY\nfor host in mirrors.fedoraproject.org mirrors.centos.org; do\n if ! getent hosts \"$host\"; then\n echo FAILED to lookup host \"$host\"\n exit 1\n fi\n if ! curl -o /dev/null https://\"$host\"; then\n echo FAILED to contact host \"$host\"\n exit 1\n fi\ndone\n", "start": "2024-09-24 15:01:27.942508", "end": "2024-09-24 15:01:28.213622", "delta": "0:00:00.271114", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euo pipefail\necho CHECK DNS AND CONNECTIVITY\nfor host in mirrors.fedoraproject.org mirrors.centos.org; do\n if ! getent hosts \"$host\"; then\n echo FAILED to lookup host \"$host\"\n exit 1\n fi\n if ! curl -o /dev/null https://\"$host\"; then\n echo FAILED to contact host \"$host\"\n exit 1\n fi\ndone\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 41684 1727204488.22695: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.148 closed. 
<<< 41684 1727204488.22752: stderr chunk (state=3): >>><<< 41684 1727204488.22755: stdout chunk (state=3): >>><<< 41684 1727204488.22780: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "CHECK DNS AND CONNECTIVITY\n2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed7 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2600:2701:4000:5211:dead:beef:fe:fed3 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed6 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2600:2701:4000:5211:dead:beef:fe:fed3 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed7 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed6 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org", "stderr": " % Total % Received % Xferd Average Speed Time Time Time Current\n Dload Upload Total Spent Left Speed\n\r 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0\r 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0\r100 305 100 305 0 0 1343 0 --:--:-- --:--:-- --:--:-- 1337\n % Total % Received % Xferd Average Speed Time Time Time Current\n Dload Upload Total Spent Left Speed\n\r 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0\r100 291 100 291 0 0 13857 0 --:--:-- --:--:-- --:--:-- 13857", "rc": 0, "cmd": "set -euo pipefail\necho CHECK DNS AND 
CONNECTIVITY\nfor host in mirrors.fedoraproject.org mirrors.centos.org; do\n if ! getent hosts \"$host\"; then\n echo FAILED to lookup host \"$host\"\n exit 1\n fi\n if ! curl -o /dev/null https://\"$host\"; then\n echo FAILED to contact host \"$host\"\n exit 1\n fi\ndone\n", "start": "2024-09-24 15:01:27.942508", "end": "2024-09-24 15:01:28.213622", "delta": "0:00:00.271114", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euo pipefail\necho CHECK DNS AND CONNECTIVITY\nfor host in mirrors.fedoraproject.org mirrors.centos.org; do\n if ! getent hosts \"$host\"; then\n echo FAILED to lookup host \"$host\"\n exit 1\n fi\n if ! curl -o /dev/null https://\"$host\"; then\n echo FAILED to contact host \"$host\"\n exit 1\n fi\ndone\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 
debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.148 closed. 41684 1727204488.22812: done with _execute_module (ansible.legacy.command, {'_raw_params': 'set -euo pipefail\necho CHECK DNS AND CONNECTIVITY\nfor host in mirrors.fedoraproject.org mirrors.centos.org; do\n if ! getent hosts "$host"; then\n echo FAILED to lookup host "$host"\n exit 1\n fi\n if ! curl -o /dev/null https://"$host"; then\n echo FAILED to contact host "$host"\n exit 1\n fi\ndone\n', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204487.6257854-44738-153734908748039/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 41684 1727204488.22821: _low_level_execute_command(): starting 41684 1727204488.22826: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204487.6257854-44738-153734908748039/ > /dev/null 2>&1 && sleep 0' 41684 1727204488.23293: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 41684 1727204488.23297: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41684 1727204488.23350: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found <<< 41684 
1727204488.23354: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204488.23356: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 41684 1727204488.23358: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41684 1727204488.23467: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 41684 1727204488.23507: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41684 1727204488.25306: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41684 1727204488.25409: stderr chunk (state=3): >>><<< 41684 1727204488.25413: stdout chunk (state=3): >>><<< 41684 1727204488.25415: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41684 1727204488.25422: handler run complete 41684 1727204488.25470: Evaluated conditional (False): False 41684 1727204488.25630: attempt loop complete, returning result 41684 1727204488.25633: _execute() done 41684 1727204488.25635: dumping result to json 41684 1727204488.25637: done dumping result, returning 41684 1727204488.25638: done running TaskExecutor() for managed-node1/TASK: Verify DNS and network connectivity [0affcd87-79f5-3839-086d-000000000bb7] 41684 1727204488.25640: sending task result for task 0affcd87-79f5-3839-086d-000000000bb7 41684 1727204488.25720: done sending task result for task 0affcd87-79f5-3839-086d-000000000bb7 41684 1727204488.25723: WORKER PROCESS EXITING ok: [managed-node1] => { "changed": false, "cmd": "set -euo pipefail\necho CHECK DNS AND CONNECTIVITY\nfor host in mirrors.fedoraproject.org mirrors.centos.org; do\n if ! getent hosts \"$host\"; then\n echo FAILED to lookup host \"$host\"\n exit 1\n fi\n if ! 
curl -o /dev/null https://\"$host\"; then\n echo FAILED to contact host \"$host\"\n exit 1\n fi\ndone\n", "delta": "0:00:00.271114", "end": "2024-09-24 15:01:28.213622", "rc": 0, "start": "2024-09-24 15:01:27.942508" } STDOUT: CHECK DNS AND CONNECTIVITY 2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.fedoraproject.org 2620:52:3:1:dead:beef:cafe:fed7 wildcard.fedoraproject.org mirrors.fedoraproject.org 2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org mirrors.fedoraproject.org 2600:2701:4000:5211:dead:beef:fe:fed3 wildcard.fedoraproject.org mirrors.fedoraproject.org 2620:52:3:1:dead:beef:cafe:fed6 wildcard.fedoraproject.org mirrors.fedoraproject.org 2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org mirrors.fedoraproject.org 2600:2701:4000:5211:dead:beef:fe:fed3 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org 2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org 2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org 2620:52:3:1:dead:beef:cafe:fed7 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org 2620:52:3:1:dead:beef:cafe:fed6 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org 2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org STDERR: % Total % Received % Xferd Average Speed Time Time Time Current Dload Upload Total Spent Left Speed 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0 100 305 100 305 0 0 1343 0 --:--:-- --:--:-- --:--:-- 1337 % Total % Received % Xferd Average Speed Time Time Time Current Dload Upload Total Spent Left Speed 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0 100 291 100 291 0 0 13857 0 --:--:-- --:--:-- --:--:-- 13857 41684 1727204488.25812: no more pending results, returning what we have 41684 1727204488.25816: 
results queue empty 41684 1727204488.25818: checking for any_errors_fatal 41684 1727204488.25831: done checking for any_errors_fatal 41684 1727204488.25832: checking for max_fail_percentage 41684 1727204488.25834: done checking for max_fail_percentage 41684 1727204488.25834: checking to see if all hosts have failed and the running result is not ok 41684 1727204488.25835: done checking to see if all hosts have failed 41684 1727204488.25836: getting the remaining hosts for this loop 41684 1727204488.25838: done getting the remaining hosts for this loop 41684 1727204488.25843: getting the next task for host managed-node1 41684 1727204488.25854: done getting next task for host managed-node1 41684 1727204488.25857: ^ task is: TASK: meta (flush_handlers) 41684 1727204488.25859: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False
41684 1727204488.25862: getting variables
41684 1727204488.25867: in VariableManager get_vars()
41684 1727204488.25923: Calling all_inventory to load vars for managed-node1
41684 1727204488.25926: Calling groups_inventory to load vars for managed-node1
41684 1727204488.25928: Calling all_plugins_inventory to load vars for managed-node1
41684 1727204488.25941: Calling all_plugins_play to load vars for managed-node1
41684 1727204488.25943: Calling groups_plugins_inventory to load vars for managed-node1
41684 1727204488.25946: Calling groups_plugins_play to load vars for managed-node1
41684 1727204488.27231: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
41684 1727204488.28172: done with get_vars()
41684 1727204488.28191: done getting variables
41684 1727204488.28240: in VariableManager get_vars()
41684 1727204488.28250: Calling all_inventory to load vars for managed-node1
41684 1727204488.28251: Calling groups_inventory to load vars for managed-node1
41684 1727204488.28253: Calling all_plugins_inventory to load vars for managed-node1
41684 1727204488.28256: Calling all_plugins_play to load vars for managed-node1
41684 1727204488.28257: Calling groups_plugins_inventory to load vars for managed-node1
41684 1727204488.28259: Calling groups_plugins_play to load vars for managed-node1
41684 1727204488.30104: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
41684 1727204488.31825: done with get_vars()
41684 1727204488.31852: done queuing things up, now waiting for results queue to drain
41684 1727204488.31854: results queue empty
41684 1727204488.31855: checking for any_errors_fatal
41684 1727204488.31859: done checking for any_errors_fatal
41684 1727204488.31860: checking for max_fail_percentage
41684 1727204488.31861: done checking for max_fail_percentage
41684 1727204488.31865: checking to see if all hosts have failed and the running result is not ok
41684 1727204488.31866: done checking to see if all hosts have failed
41684 1727204488.31867: getting the remaining hosts for this loop
41684 1727204488.31868: done getting the remaining hosts for this loop
41684 1727204488.31871: getting the next task for host managed-node1
41684 1727204488.31875: done getting next task for host managed-node1
41684 1727204488.31876:  ^ task is: TASK: meta (flush_handlers)
41684 1727204488.31878:  ^ state is: HOST STATE: block=5, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
41684 1727204488.31881: getting variables
41684 1727204488.31882: in VariableManager get_vars()
41684 1727204488.31896: Calling all_inventory to load vars for managed-node1
41684 1727204488.31898: Calling groups_inventory to load vars for managed-node1
41684 1727204488.31900: Calling all_plugins_inventory to load vars for managed-node1
41684 1727204488.31905: Calling all_plugins_play to load vars for managed-node1
41684 1727204488.31908: Calling groups_plugins_inventory to load vars for managed-node1
41684 1727204488.31910: Calling groups_plugins_play to load vars for managed-node1
41684 1727204488.33085: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
41684 1727204488.34708: done with get_vars()
41684 1727204488.34730: done getting variables
41684 1727204488.34787: in VariableManager get_vars()
41684 1727204488.34802: Calling all_inventory to load vars for managed-node1
41684 1727204488.34805: Calling groups_inventory to load vars for managed-node1
41684 1727204488.34807: Calling all_plugins_inventory to load vars for managed-node1
41684 1727204488.34816: Calling all_plugins_play to load vars for managed-node1
41684 1727204488.34819: Calling groups_plugins_inventory to load vars for managed-node1
41684 1727204488.34822: Calling groups_plugins_play to load vars for managed-node1
41684 1727204488.36082: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
41684 1727204488.37740: done with get_vars()
41684 1727204488.37770: done queuing things up, now waiting for results queue to drain
41684 1727204488.37772: results queue empty
41684 1727204488.37773: checking for any_errors_fatal
41684 1727204488.37774: done checking for any_errors_fatal
41684 1727204488.37775: checking for max_fail_percentage
41684 1727204488.37776: done checking for max_fail_percentage
41684 1727204488.37777: checking to see if all hosts have failed and the running result is not ok
41684 1727204488.37778: done checking to see if all hosts have failed
41684 1727204488.37779: getting the remaining hosts for this loop
41684 1727204488.37780: done getting the remaining hosts for this loop
41684 1727204488.37783: getting the next task for host managed-node1
41684 1727204488.37786: done getting next task for host managed-node1
41684 1727204488.37787:  ^ task is: None
41684 1727204488.37789:  ^ state is: HOST STATE: block=6, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
41684 1727204488.37790: done queuing things up, now waiting for results queue to drain
41684 1727204488.37791: results queue empty
41684 1727204488.37791: checking for any_errors_fatal
41684 1727204488.37792: done checking for any_errors_fatal
41684 1727204488.37793: checking for max_fail_percentage
41684 1727204488.37794: done checking for max_fail_percentage
41684 1727204488.37795: checking to see if all hosts have failed and the running result is not ok
41684 1727204488.37795: done checking to see if all hosts have failed
41684 1727204488.37797: getting the next task for host managed-node1
41684 1727204488.37799: done getting next task for host managed-node1
41684 1727204488.37800:  ^ task is: None
41684 1727204488.37801:  ^ state is: HOST STATE: block=6, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False

PLAY RECAP *********************************************************************
managed-node1              : ok=108  changed=3    unreachable=0    failed=0    skipped=87   rescued=0    ignored=2

Tuesday 24 September 2024  15:01:28 -0400 (0:00:00.804)       0:00:44.780 *****
===============================================================================
Gathering Facts --------------------------------------------------------- 2.03s
/tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_route_device.yml:3
Install iproute --------------------------------------------------------- 1.83s
/tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:16
fedora.linux_system_roles.network : Check which services are running ---- 1.70s
/tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21
Gathering Facts --------------------------------------------------------- 1.62s
/tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/tests_route_device_nm.yml:6
fedora.linux_system_roles.network : Check which services are running ---- 1.60s
/tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21
fedora.linux_system_roles.network : Check which services are running ---- 1.51s
/tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21
Get the interface1 MAC address ------------------------------------------ 1.45s
/tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_route_device.yml:99
Install iproute --------------------------------------------------------- 1.22s
/tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:16
fedora.linux_system_roles.network : Enable and start NetworkManager ----- 1.22s
/tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122
Create veth interface ethtest0 ------------------------------------------ 1.22s
/tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:27
fedora.linux_system_roles.network : Check which packages are installed --- 1.02s
/tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26
Create veth interface ethtest1 ------------------------------------------ 1.00s
/tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:27
fedora.linux_system_roles.network : Check which packages are installed --- 0.97s
/tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26
fedora.linux_system_roles.network : Configure networking connection profiles --- 0.92s
/tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159
fedora.linux_system_roles.network : Configure networking connection profiles --- 0.85s
/tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159
fedora.linux_system_roles.network : Configure networking connection profiles --- 0.83s
/tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159
Verify DNS and network connectivity ------------------------------------- 0.80s
/tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml:24
fedora.linux_system_roles.network : Check which packages are installed --- 0.75s
/tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26
Gather the minimum subset of ansible_facts required by the network role test --- 0.74s
/tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:3
Gather current interface info ------------------------------------------- 0.69s
/tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:3
41684 1727204488.37948: RUNNING CLEANUP