[DEPRECATION WARNING]: ANSIBLE_COLLECTIONS_PATHS option, does not fit var naming standard, use the singular form ANSIBLE_COLLECTIONS_PATH instead. This feature will be removed from ansible-core in version 2.19. Deprecation warnings can be disabled by setting deprecation_warnings=False in ansible.cfg.
11044 1726853235.30683: starting run
ansible-playbook [core 2.17.4]
  config file = None
  configured module search path = ['/root/.ansible/plugins/modules', '/usr/share/ansible/plugins/modules']
  ansible python module location = /usr/local/lib/python3.12/site-packages/ansible
  ansible collection location = /tmp/collections-Qi7
  executable location = /usr/local/bin/ansible-playbook
  python version = 3.12.5 (main, Aug 23 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] (/usr/bin/python3.12)
  jinja version = 3.1.4
  libyaml = True
No config file found; using defaults
11044 1726853235.30974: Added group all to inventory
11044 1726853235.30976: Added group ungrouped to inventory
11044 1726853235.30978: Group all now contains ungrouped
11044 1726853235.30981: Examining possible inventory source: /tmp/network-iHm/inventory.yml
11044 1726853235.44504: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/cache
11044 1726853235.44549: Loading CacheModule 'memory' from /usr/local/lib/python3.12/site-packages/ansible/plugins/cache/memory.py
11044 1726853235.44565: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory
11044 1726853235.44605: Loading InventoryModule 'host_list' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/host_list.py
11044 1726853235.44656: Loaded config def from plugin (inventory/script)
11044 1726853235.44657: Loading InventoryModule 'script' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/script.py
11044 1726853235.44686: Loading InventoryModule 'auto' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/auto.py
11044 1726853235.44743: Loaded config def from plugin
(inventory/yaml) 11044 1726853235.44745: Loading InventoryModule 'yaml' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/yaml.py 11044 1726853235.44804: Loading InventoryModule 'ini' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/ini.py 11044 1726853235.45082: Loading InventoryModule 'toml' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/toml.py 11044 1726853235.45084: Attempting to use plugin host_list (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/host_list.py) 11044 1726853235.45088: Attempting to use plugin script (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/script.py) 11044 1726853235.45092: Attempting to use plugin auto (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/auto.py) 11044 1726853235.45095: Loading data from /tmp/network-iHm/inventory.yml 11044 1726853235.45135: /tmp/network-iHm/inventory.yml was not parsable by auto 11044 1726853235.45181: Attempting to use plugin yaml (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/yaml.py) 11044 1726853235.45209: Loading data from /tmp/network-iHm/inventory.yml 11044 1726853235.45260: group all already in inventory 11044 1726853235.45265: set inventory_file for managed_node1 11044 1726853235.45268: set inventory_dir for managed_node1 11044 1726853235.45269: Added host managed_node1 to inventory 11044 1726853235.45272: Added host managed_node1 to group all 11044 1726853235.45273: set ansible_host for managed_node1 11044 1726853235.45274: set ansible_ssh_extra_args for managed_node1 11044 1726853235.45277: set inventory_file for managed_node2 11044 1726853235.45282: set inventory_dir for managed_node2 11044 1726853235.45283: Added host managed_node2 to inventory 11044 1726853235.45285: Added host managed_node2 to group all 11044 1726853235.45286: set ansible_host for managed_node2 11044 1726853235.45286: set ansible_ssh_extra_args for managed_node2 11044 
1726853235.45289: set inventory_file for managed_node3 11044 1726853235.45292: set inventory_dir for managed_node3 11044 1726853235.45292: Added host managed_node3 to inventory 11044 1726853235.45294: Added host managed_node3 to group all 11044 1726853235.45294: set ansible_host for managed_node3 11044 1726853235.45295: set ansible_ssh_extra_args for managed_node3 11044 1726853235.45297: Reconcile groups and hosts in inventory. 11044 1726853235.45307: Group ungrouped now contains managed_node1 11044 1726853235.45309: Group ungrouped now contains managed_node2 11044 1726853235.45310: Group ungrouped now contains managed_node3 11044 1726853235.45390: '/usr/local/lib/python3.12/site-packages/ansible/plugins/vars/__init__' skipped due to reserved name 11044 1726853235.45522: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments 11044 1726853235.45569: Loading ModuleDocFragment 'vars_plugin_staging' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/vars_plugin_staging.py 11044 1726853235.45604: Loaded config def from plugin (vars/host_group_vars) 11044 1726853235.45607: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.12/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=False, class_only=True) 11044 1726853235.45615: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/vars 11044 1726853235.45623: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.12/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 11044 1726853235.45669: Loading CacheModule 'memory' from /usr/local/lib/python3.12/site-packages/ansible/plugins/cache/memory.py (found_in_cache=True, class_only=False) 11044 1726853235.46038: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11044 1726853235.46148: Loading ModuleDocFragment 'connection_pipelining' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py 11044 1726853235.46191: Loaded config def from plugin (connection/local) 11044 1726853235.46194: Loading Connection 'local' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/local.py (found_in_cache=False, class_only=True) 11044 1726853235.46703: Loaded config def from plugin (connection/paramiko_ssh) 11044 1726853235.46705: Loading Connection 'paramiko_ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/paramiko_ssh.py (found_in_cache=False, class_only=True) 11044 1726853235.47274: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py (found_in_cache=True, class_only=False) 11044 1726853235.47297: Loaded config def from plugin (connection/psrp) 11044 1726853235.47299: Loading Connection 'psrp' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/psrp.py (found_in_cache=False, class_only=True) 11044 1726853235.47705: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py (found_in_cache=True, class_only=False) 11044 1726853235.47727: Loaded config def from plugin (connection/ssh) 11044 1726853235.47729: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=False, class_only=True) 11044 1726853235.49392: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py (found_in_cache=True, class_only=False) 11044 1726853235.49432: Loaded config def from plugin (connection/winrm) 11044 1726853235.49435: Loading Connection 'winrm' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/winrm.py (found_in_cache=False, class_only=True) 11044 1726853235.49468: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/__init__' skipped due to reserved name 11044 1726853235.49530: Loading ModuleDocFragment 'shell_windows' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/shell_windows.py 11044 1726853235.49599: Loaded config def from plugin (shell/cmd) 11044 1726853235.49601: Loading ShellModule 'cmd' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/cmd.py (found_in_cache=False, class_only=True) 11044 1726853235.49627: Loading ModuleDocFragment 'shell_windows' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/shell_windows.py (found_in_cache=True, class_only=False) 11044 1726853235.49697: Loaded config def from plugin (shell/powershell) 11044 1726853235.49699: Loading ShellModule 'powershell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/powershell.py (found_in_cache=False, class_only=True) 11044 1726853235.49753: Loading ModuleDocFragment 'shell_common' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/shell_common.py 11044 1726853235.49904: Loaded config def from plugin (shell/sh) 11044 1726853235.49905: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=False, class_only=True) 11044 1726853235.49927: '/usr/local/lib/python3.12/site-packages/ansible/plugins/become/__init__' skipped due to reserved name 11044 1726853235.50004: Loaded config def from plugin (become/runas) 11044 1726853235.50005: Loading BecomeModule 'runas' from /usr/local/lib/python3.12/site-packages/ansible/plugins/become/runas.py (found_in_cache=False, class_only=True) 11044 1726853235.50125: Loaded config def from plugin (become/su) 11044 1726853235.50127: Loading BecomeModule 'su' from /usr/local/lib/python3.12/site-packages/ansible/plugins/become/su.py (found_in_cache=False, class_only=True) 11044 1726853235.50223: Loaded config def from plugin (become/sudo) 11044 
1726853235.50224: Loading BecomeModule 'sudo' from /usr/local/lib/python3.12/site-packages/ansible/plugins/become/sudo.py (found_in_cache=False, class_only=True) running playbook inside collection fedora.linux_system_roles 11044 1726853235.50248: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/tests_bond_deprecated_nm.yml 11044 1726853235.50466: in VariableManager get_vars() 11044 1726853235.50483: done with get_vars() 11044 1726853235.50573: trying /usr/local/lib/python3.12/site-packages/ansible/modules 11044 1726853235.53106: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/action 11044 1726853235.53221: in VariableManager get_vars() 11044 1726853235.53226: done with get_vars() 11044 1726853235.53228: variable 'playbook_dir' from source: magic vars 11044 1726853235.53229: variable 'ansible_playbook_python' from source: magic vars 11044 1726853235.53230: variable 'ansible_config_file' from source: magic vars 11044 1726853235.53231: variable 'groups' from source: magic vars 11044 1726853235.53232: variable 'omit' from source: magic vars 11044 1726853235.53232: variable 'ansible_version' from source: magic vars 11044 1726853235.53233: variable 'ansible_check_mode' from source: magic vars 11044 1726853235.53234: variable 'ansible_diff_mode' from source: magic vars 11044 1726853235.53235: variable 'ansible_forks' from source: magic vars 11044 1726853235.53235: variable 'ansible_inventory_sources' from source: magic vars 11044 1726853235.53236: variable 'ansible_skip_tags' from source: magic vars 11044 1726853235.53237: variable 'ansible_limit' from source: magic vars 11044 1726853235.53237: variable 'ansible_run_tags' from source: magic vars 11044 1726853235.53238: variable 'ansible_verbosity' from source: magic vars 11044 1726853235.53277: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond_deprecated.yml 11044 1726853235.53848: in 
VariableManager get_vars() 11044 1726853235.53859: done with get_vars() 11044 1726853235.53865: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_test_interfaces_with_dhcp.yml statically imported: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_test_interfaces_with_dhcp.yml 11044 1726853235.54450: in VariableManager get_vars() 11044 1726853235.54459: done with get_vars() 11044 1726853235.54465: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml statically imported: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml 11044 1726853235.54534: in VariableManager get_vars() 11044 1726853235.54552: done with get_vars() 11044 1726853235.54640: in VariableManager get_vars() 11044 1726853235.54649: done with get_vars() 11044 1726853235.54655: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml statically imported: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml 11044 1726853235.54701: in VariableManager get_vars() 11044 1726853235.54711: done with get_vars() 11044 1726853235.54897: in VariableManager get_vars() 11044 1726853235.54906: done with get_vars() 11044 1726853235.54909: variable 'omit' from source: magic vars 11044 1726853235.54921: variable 'omit' from source: magic vars 11044 1726853235.54942: in VariableManager get_vars() 11044 1726853235.54954: done with get_vars() 11044 1726853235.54985: in VariableManager get_vars() 11044 1726853235.54993: done with get_vars() 11044 1726853235.55015: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/defaults/main.yml 11044 
1726853235.55145: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/meta/main.yml 11044 1726853235.55223: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml 11044 1726853235.55713: in VariableManager get_vars() 11044 1726853235.55732: done with get_vars() 11044 1726853235.56118: trying /usr/local/lib/python3.12/site-packages/ansible/modules/__pycache__ 11044 1726853235.56249: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__ redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 11044 1726853235.57336: in VariableManager get_vars() 11044 1726853235.57347: done with get_vars() 11044 1726853235.57354: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml statically imported: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml 11044 1726853235.57466: in VariableManager get_vars() 11044 1726853235.57480: done with get_vars() 11044 1726853235.57553: in VariableManager get_vars() 11044 1726853235.57563: done with get_vars() 11044 1726853235.57798: in VariableManager get_vars() 11044 1726853235.57809: done with get_vars() 11044 1726853235.57812: variable 'omit' from source: magic vars 11044 1726853235.57827: variable 'omit' from source: magic vars 11044 1726853235.57849: in VariableManager get_vars() 11044 1726853235.57858: done with get_vars() 11044 1726853235.57870: in VariableManager get_vars() 11044 1726853235.57882: done with get_vars() 11044 1726853235.57902: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/defaults/main.yml 11044 1726853235.57962: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/meta/main.yml 11044 1726853235.59208: Loading data from 
/tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml 11044 1726853235.59431: in VariableManager get_vars() 11044 1726853235.59446: done with get_vars() redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 11044 1726853235.60695: in VariableManager get_vars() 11044 1726853235.60709: done with get_vars() 11044 1726853235.60714: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/remove_test_interfaces_with_dhcp.yml statically imported: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/remove_test_interfaces_with_dhcp.yml 11044 1726853235.61029: in VariableManager get_vars() 11044 1726853235.61041: done with get_vars() 11044 1726853235.61083: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/callback 11044 1726853235.61091: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__ redirecting (type: callback) ansible.builtin.debug to ansible.posix.debug redirecting (type: callback) ansible.builtin.debug to ansible.posix.debug 11044 1726853235.61243: Loading ModuleDocFragment 'default_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/default_callback.py 11044 1726853235.61336: Loaded config def from plugin (callback/ansible_collections.ansible.posix.plugins.callback.debug) 11044 1726853235.61338: Loading CallbackModule 'ansible_collections.ansible.posix.plugins.callback.debug' from /tmp/collections-Qi7/ansible_collections/ansible/posix/plugins/callback/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback) 11044 1726853235.61359: '/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__init__' skipped due to reserved name 11044 1726853235.61377: Loading ModuleDocFragment 'default_callback' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/default_callback.py (found_in_cache=True, class_only=False) 11044 1726853235.61475: Loading ModuleDocFragment 'result_format_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/result_format_callback.py 11044 1726853235.61513: Loaded config def from plugin (callback/default) 11044 1726853235.61515: Loading CallbackModule 'default' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/default.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback) (found_in_cache=False, class_only=True) 11044 1726853235.62257: Loaded config def from plugin (callback/junit) 11044 1726853235.62259: Loading CallbackModule 'junit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/junit.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback) (found_in_cache=False, class_only=True) 11044 1726853235.62290: Loading ModuleDocFragment 'result_format_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/result_format_callback.py (found_in_cache=True, class_only=False) 11044 1726853235.62326: Loaded config def from plugin (callback/minimal) 11044 1726853235.62327: Loading CallbackModule 'minimal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/minimal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback) (found_in_cache=False, class_only=True) 11044 1726853235.62353: Loading CallbackModule 'oneline' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/oneline.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback) (found_in_cache=False, class_only=True) 11044 1726853235.62394: Loaded config def from plugin (callback/tree) 11044 1726853235.62396: Loading CallbackModule 'tree' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/tree.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback) (found_in_cache=False, class_only=True) redirecting (type: callback) ansible.builtin.profile_tasks to ansible.posix.profile_tasks 11044 1726853235.62464: Loaded config def from plugin (callback/ansible_collections.ansible.posix.plugins.callback.profile_tasks) 11044 1726853235.62465: Loading CallbackModule 'ansible_collections.ansible.posix.plugins.callback.profile_tasks' from /tmp/collections-Qi7/ansible_collections/ansible/posix/plugins/callback/profile_tasks.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback) (found_in_cache=False, class_only=True) Skipping callback 'default', as we already have a stdout callback. Skipping callback 'minimal', as we already have a stdout callback. Skipping callback 'oneline', as we already have a stdout callback. 
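The deprecation warning at the top of this log suggests its own remedy. A minimal sketch (the collections path below mirrors the one in this run and is purely illustrative; the config file is written to /tmp for demonstration only):

```shell
# Prefer the singular variable name going forward, as the warning advises.
export ANSIBLE_COLLECTIONS_PATH=/tmp/collections-Qi7

# Alternatively, disable deprecation warnings via ansible.cfg,
# exactly as the warning text describes.
printf '[defaults]\ndeprecation_warnings = False\n' > /tmp/ansible_cfg_demo.cfg
cat /tmp/ansible_cfg_demo.cfg
```

Either approach silences the first line of this log; the environment variable is the forward-compatible fix, since the plural form is removed in ansible-core 2.19.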
PLAYBOOK: tests_bond_deprecated_nm.yml *****************************************
2 plays in /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/tests_bond_deprecated_nm.yml
11044 1726853235.62487: in VariableManager get_vars()
11044 1726853235.62495: done with get_vars()
11044 1726853235.62498: in VariableManager get_vars()
11044 1726853235.62503: done with get_vars()
11044 1726853235.62506: variable 'omit' from source: magic vars
11044 1726853235.62527: in VariableManager get_vars()
11044 1726853235.62535: done with get_vars()
11044 1726853235.62547: variable 'omit' from source: magic vars

PLAY [Run playbook 'playbooks/tests_bond_deprecated.yml' with nm as provider] ***
11044 1726853235.62913: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy
11044 1726853235.62961: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py
11044 1726853235.62989: getting the remaining hosts for this loop
11044 1726853235.62990: done getting the remaining hosts for this loop
11044 1726853235.62992: getting the next task for host managed_node1
11044 1726853235.62994: done getting next task for host managed_node1
11044 1726853235.62995: ^ task is: TASK: Gathering Facts
11044 1726853235.62996: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
11044 1726853235.62998: getting variables
11044 1726853235.62998: in VariableManager get_vars()
11044 1726853235.63005: Calling all_inventory to load vars for managed_node1
11044 1726853235.63006: Calling groups_inventory to load vars for managed_node1
11044 1726853235.63008: Calling all_plugins_inventory to load vars for managed_node1
11044 1726853235.63017: Calling all_plugins_play to load vars for managed_node1
11044 1726853235.63026: Calling groups_plugins_inventory to load vars for managed_node1
11044 1726853235.63028: Calling groups_plugins_play to load vars for managed_node1
11044 1726853235.63050: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
11044 1726853235.63088: done with get_vars()
11044 1726853235.63092: done getting variables
11044 1726853235.63137: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True)

TASK [Gathering Facts] *********************************************************
task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/tests_bond_deprecated_nm.yml:6
Friday 20 September 2024 13:27:15 -0400 (0:00:00.007) 0:00:00.007 ******
11044 1726853235.63152: entering _queue_task() for managed_node1/gather_facts
11044 1726853235.63153: Creating lock for gather_facts
11044 1726853235.63435: worker is 1 (out of 1 available)
11044 1726853235.63445: exiting _queue_task() for managed_node1/gather_facts
11044 1726853235.63459: done queuing things up, now waiting for results queue to drain
11044 1726853235.63461: waiting for pending results...
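The first `_low_level_execute_command()` call that follows runs `echo ~ && sleep 0` over SSH. A minimal local sketch of that probe: the login shell expands `~` to the remote user's home directory (which Ansible uses to place its temp directory), and the trailing `sleep 0` keeps the pipeline's exit status well-defined.

```shell
# Reproduce Ansible's home-directory probe locally.
# On the managed node this same command returned "/root" in the log below.
/bin/sh -c 'echo ~ && sleep 0'
```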
11044 1726853235.63606: running TaskExecutor() for managed_node1/TASK: Gathering Facts 11044 1726853235.63654: in run() - task 02083763-bbaf-c5a6-f857-0000000000cd 11044 1726853235.63667: variable 'ansible_search_path' from source: unknown 11044 1726853235.63699: calling self._execute() 11044 1726853235.63751: variable 'ansible_host' from source: host vars for 'managed_node1' 11044 1726853235.63757: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11044 1726853235.63765: variable 'omit' from source: magic vars 11044 1726853235.63838: variable 'omit' from source: magic vars 11044 1726853235.63859: variable 'omit' from source: magic vars 11044 1726853235.63885: variable 'omit' from source: magic vars 11044 1726853235.63921: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11044 1726853235.63981: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11044 1726853235.63996: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11044 1726853235.64010: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11044 1726853235.64021: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11044 1726853235.64043: variable 'inventory_hostname' from source: host vars for 'managed_node1' 11044 1726853235.64046: variable 'ansible_host' from source: host vars for 'managed_node1' 11044 1726853235.64052: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11044 1726853235.64120: Set connection var ansible_timeout to 10 11044 1726853235.64131: Set connection var ansible_shell_executable to /bin/sh 11044 1726853235.64134: Set connection var ansible_shell_type to sh 11044 1726853235.64137: Set connection var ansible_module_compression to 
ZIP_DEFLATED 11044 1726853235.64139: Set connection var ansible_connection to ssh 11044 1726853235.64143: Set connection var ansible_pipelining to False 11044 1726853235.64163: variable 'ansible_shell_executable' from source: unknown 11044 1726853235.64167: variable 'ansible_connection' from source: unknown 11044 1726853235.64169: variable 'ansible_module_compression' from source: unknown 11044 1726853235.64173: variable 'ansible_shell_type' from source: unknown 11044 1726853235.64175: variable 'ansible_shell_executable' from source: unknown 11044 1726853235.64178: variable 'ansible_host' from source: host vars for 'managed_node1' 11044 1726853235.64180: variable 'ansible_pipelining' from source: unknown 11044 1726853235.64184: variable 'ansible_timeout' from source: unknown 11044 1726853235.64188: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11044 1726853235.64324: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11044 1726853235.64330: variable 'omit' from source: magic vars 11044 1726853235.64333: starting attempt loop 11044 1726853235.64336: running the handler 11044 1726853235.64354: variable 'ansible_facts' from source: unknown 11044 1726853235.64368: _low_level_execute_command(): starting 11044 1726853235.64377: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11044 1726853235.64877: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11044 1726853235.64882: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 
originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11044 1726853235.64902: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11044 1726853235.64953: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 11044 1726853235.64956: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11044 1726853235.64960: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11044 1726853235.65011: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11044 1726853235.66686: stdout chunk (state=3): >>>/root <<< 11044 1726853235.66782: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11044 1726853235.66812: stderr chunk (state=3): >>><<< 11044 1726853235.66816: stdout chunk (state=3): >>><<< 11044 1726853235.66837: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is 
address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11044 1726853235.66876: _low_level_execute_command(): starting 11044 1726853235.66880: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853235.6683707-11081-82751601766332 `" && echo ansible-tmp-1726853235.6683707-11081-82751601766332="` echo /root/.ansible/tmp/ansible-tmp-1726853235.6683707-11081-82751601766332 `" ) && sleep 0' 11044 1726853235.67296: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11044 1726853235.67299: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11044 1726853235.67301: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration <<< 11044 1726853235.67303: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11044 1726853235.67312: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11044 1726853235.67358: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 11044 1726853235.67368: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11044 1726853235.67373: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11044 1726853235.67404: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11044 1726853235.69270: stdout chunk (state=3): >>>ansible-tmp-1726853235.6683707-11081-82751601766332=/root/.ansible/tmp/ansible-tmp-1726853235.6683707-11081-82751601766332 <<< 11044 1726853235.69382: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11044 1726853235.69411: stderr chunk (state=3): >>><<< 11044 1726853235.69415: stdout chunk (state=3): >>><<< 11044 1726853235.69431: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853235.6683707-11081-82751601766332=/root/.ansible/tmp/ansible-tmp-1726853235.6683707-11081-82751601766332 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11044 1726853235.69458: variable 'ansible_module_compression' from source: unknown 11044 1726853235.69498: ANSIBALLZ: Using generic lock for ansible.legacy.setup 11044 1726853235.69503: ANSIBALLZ: Acquiring lock 11044 1726853235.69506: ANSIBALLZ: Lock acquired: 140360202229168 11044 1726853235.69508: ANSIBALLZ: Creating module 11044 1726853235.90377: ANSIBALLZ: Writing module into payload 11044 1726853235.90436: ANSIBALLZ: Writing module 11044 1726853235.90463: ANSIBALLZ: Renaming module 11044 1726853235.90479: ANSIBALLZ: Done creating module 11044 1726853235.90506: variable 'ansible_facts' from source: unknown 11044 1726853235.90519: variable 'inventory_hostname' from source: host vars for 'managed_node1' 11044 1726853235.90534: _low_level_execute_command(): starting 11044 1726853235.90545: _low_level_execute_command(): executing: /bin/sh -c 'echo PLATFORM; uname; echo FOUND; command -v '"'"'python3.12'"'"'; command -v '"'"'python3.11'"'"'; command -v '"'"'python3.10'"'"'; command -v '"'"'python3.9'"'"'; command -v '"'"'python3.8'"'"'; command -v '"'"'python3.7'"'"'; command -v '"'"'/usr/bin/python3'"'"'; command -v '"'"'python3'"'"'; echo ENDFOUND && sleep 0' 11044 1726853235.91194: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11044 1726853235.91286: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11044 1726853235.91353: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11044 1726853235.93038: stdout chunk (state=3): >>>PLATFORM <<< 11044 1726853235.93114: stdout chunk (state=3): >>>Linux <<< 11044 1726853235.93139: stdout chunk (state=3): >>>FOUND /usr/bin/python3.12 <<< 11044 1726853235.93142: stdout chunk (state=3): >>>/usr/bin/python3 /usr/bin/python3 ENDFOUND <<< 11044 1726853235.93364: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11044 1726853235.93368: stdout chunk (state=3): >>><<< 11044 1726853235.93370: stderr chunk (state=3): >>><<< 11044 1726853235.93388: _low_level_execute_command() done: rc=0, stdout=PLATFORM Linux FOUND /usr/bin/python3.12 /usr/bin/python3 /usr/bin/python3 ENDFOUND , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not 
found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11044 1726853235.93404 [managed_node1]: found interpreters: ['/usr/bin/python3.12', '/usr/bin/python3', '/usr/bin/python3'] 11044 1726853235.93535: _low_level_execute_command(): starting 11044 1726853235.93539: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 && sleep 0' 11044 1726853235.93665: Sending initial data 11044 1726853235.93669: Sent initial data (1181 bytes) 11044 1726853235.94122: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11044 1726853235.94125: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11044 1726853235.94127: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11044 1726853235.94129: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing 
configuration debug1: Reading configuration data /root/.ssh/config <<< 11044 1726853235.94131: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found <<< 11044 1726853235.94137: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11044 1726853235.94194: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 11044 1726853235.94198: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11044 1726853235.94249: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11044 1726853235.97631: stdout chunk (state=3): >>>{"platform_dist_result": [], "osrelease_content": "NAME=\"CentOS Stream\"\nVERSION=\"10 (Coughlan)\"\nID=\"centos\"\nID_LIKE=\"rhel fedora\"\nVERSION_ID=\"10\"\nPLATFORM_ID=\"platform:el10\"\nPRETTY_NAME=\"CentOS Stream 10 (Coughlan)\"\nANSI_COLOR=\"0;31\"\nLOGO=\"fedora-logo-icon\"\nCPE_NAME=\"cpe:/o:centos:centos:10\"\nHOME_URL=\"https://centos.org/\"\nVENDOR_NAME=\"CentOS\"\nVENDOR_URL=\"https://centos.org/\"\nBUG_REPORT_URL=\"https://issues.redhat.com/\"\nREDHAT_SUPPORT_PRODUCT=\"Red Hat Enterprise Linux 10\"\nREDHAT_SUPPORT_PRODUCT_VERSION=\"CentOS Stream\"\n"} <<< 11044 1726853235.97982: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11044 1726853235.98010: stderr chunk (state=3): >>><<< 11044 1726853235.98013: stdout chunk (state=3): >>><<< 11044 1726853235.98025: _low_level_execute_command() done: rc=0, stdout={"platform_dist_result": [], "osrelease_content": "NAME=\"CentOS Stream\"\nVERSION=\"10 (Coughlan)\"\nID=\"centos\"\nID_LIKE=\"rhel fedora\"\nVERSION_ID=\"10\"\nPLATFORM_ID=\"platform:el10\"\nPRETTY_NAME=\"CentOS Stream 
10 (Coughlan)\"\nANSI_COLOR=\"0;31\"\nLOGO=\"fedora-logo-icon\"\nCPE_NAME=\"cpe:/o:centos:centos:10\"\nHOME_URL=\"https://centos.org/\"\nVENDOR_NAME=\"CentOS\"\nVENDOR_URL=\"https://centos.org/\"\nBUG_REPORT_URL=\"https://issues.redhat.com/\"\nREDHAT_SUPPORT_PRODUCT=\"Red Hat Enterprise Linux 10\"\nREDHAT_SUPPORT_PRODUCT_VERSION=\"CentOS Stream\"\n"} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11044 1726853235.98087: variable 'ansible_facts' from source: unknown 11044 1726853235.98090: variable 'ansible_facts' from source: unknown 11044 1726853235.98098: variable 'ansible_module_compression' from source: unknown 11044 1726853235.98129: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-1104467doc9gy/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 11044 1726853235.98154: variable 'ansible_facts' from source: unknown 11044 
1726853235.98255: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853235.6683707-11081-82751601766332/AnsiballZ_setup.py 11044 1726853235.98468: Sending initial data 11044 1726853235.98473: Sent initial data (153 bytes) 11044 1726853235.99091: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 11044 1726853235.99110: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11044 1726853235.99122: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11044 1726853235.99183: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11044 1726853236.00723: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 
1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 <<< 11044 1726853236.00733: stderr chunk (state=3): >>>debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 11044 1726853236.00767: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 11044 1726853236.00810: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-1104467doc9gy/tmpucnmknfa /root/.ansible/tmp/ansible-tmp-1726853235.6683707-11081-82751601766332/AnsiballZ_setup.py <<< 11044 1726853236.00814: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853235.6683707-11081-82751601766332/AnsiballZ_setup.py" <<< 11044 1726853236.00842: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-1104467doc9gy/tmpucnmknfa" to remote "/root/.ansible/tmp/ansible-tmp-1726853235.6683707-11081-82751601766332/AnsiballZ_setup.py" <<< 11044 1726853236.00848: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853235.6683707-11081-82751601766332/AnsiballZ_setup.py" <<< 11044 1726853236.01898: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11044 1726853236.01943: stderr chunk (state=3): >>><<< 11044 1726853236.01949: stdout chunk (state=3): >>><<< 11044 1726853236.01963: done transferring module to remote 11044 1726853236.01978: _low_level_execute_command(): starting 11044 1726853236.01982: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853235.6683707-11081-82751601766332/ /root/.ansible/tmp/ansible-tmp-1726853235.6683707-11081-82751601766332/AnsiballZ_setup.py && 
sleep 0' 11044 1726853236.02431: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11044 1726853236.02434: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found <<< 11044 1726853236.02436: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11044 1726853236.02442: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11044 1726853236.02447: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11044 1726853236.02493: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 11044 1726853236.02497: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11044 1726853236.02542: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11044 1726853236.04292: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11044 1726853236.04316: stderr chunk (state=3): >>><<< 11044 1726853236.04321: stdout chunk (state=3): >>><<< 11044 1726853236.04334: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11044 1726853236.04337: _low_level_execute_command(): starting 11044 1726853236.04342: _low_level_execute_command(): executing: /bin/sh -c 'PYTHONVERBOSE=1 /usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853235.6683707-11081-82751601766332/AnsiballZ_setup.py && sleep 0' 11044 1726853236.04753: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11044 1726853236.04756: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11044 1726853236.04758: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address 
debug1: re-parsing configuration <<< 11044 1726853236.04761: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11044 1726853236.04810: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 11044 1726853236.04814: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11044 1726853236.04861: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11044 1726853236.07006: stdout chunk (state=3): >>>import _frozen_importlib # frozen <<< 11044 1726853236.07048: stdout chunk (state=3): >>>import _imp # builtin <<< 11044 1726853236.07076: stdout chunk (state=3): >>>import '_thread' # <<< 11044 1726853236.07087: stdout chunk (state=3): >>>import '_warnings' # import '_weakref' # <<< 11044 1726853236.07134: stdout chunk (state=3): >>>import '_io' # import 'marshal' # <<< 11044 1726853236.07169: stdout chunk (state=3): >>>import 'posix' # <<< 11044 1726853236.07207: stdout chunk (state=3): >>>import '_frozen_importlib_external' # # installing zipimport hook <<< 11044 1726853236.07227: stdout chunk (state=3): >>>import 'time' # import 'zipimport' # # installed zipimport hook <<< 11044 1726853236.07289: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py <<< 11044 1726853236.07310: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' import '_codecs' # <<< 11044 1726853236.07328: stdout chunk (state=3): >>>import 'codecs' # <<< 11044 
1726853236.07363: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py <<< 11044 1726853236.07394: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' <<< 11044 1726853236.07427: stdout chunk (state=3): >>>import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7f84895bc4d0> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f848958bb00> <<< 11044 1726853236.07442: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7f84895bea50> import '_signal' # <<< 11044 1726853236.07469: stdout chunk (state=3): >>>import '_abc' # <<< 11044 1726853236.07490: stdout chunk (state=3): >>>import 'abc' # import 'io' # <<< 11044 1726853236.07523: stdout chunk (state=3): >>>import '_stat' # import 'stat' # <<< 11044 1726853236.07604: stdout chunk (state=3): >>>import '_collections_abc' # <<< 11044 1726853236.07631: stdout chunk (state=3): >>>import 'genericpath' # import 'posixpath' # <<< 11044 1726853236.07667: stdout chunk (state=3): >>>import 'os' # <<< 11044 1726853236.07701: stdout chunk (state=3): >>>import '_sitebuiltins' # Processing user site-packages <<< 11044 1726853236.07720: stdout chunk (state=3): >>>Processing global site-packages Adding directory: '/usr/lib64/python3.12/site-packages' Adding directory: '/usr/lib/python3.12/site-packages' Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' <<< 11044 1726853236.07754: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8_sig.py # 
code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' <<< 11044 1726853236.07773: stdout chunk (state=3): >>>import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7f84895cd130> <<< 11044 1726853236.07832: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py <<< 11044 1726853236.07835: stdout chunk (state=3): >>># code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' <<< 11044 1726853236.07861: stdout chunk (state=3): >>>import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7f84895cdfa0> import 'site' # <<< 11044 1726853236.07895: stdout chunk (state=3): >>>Python 3.12.5 (main, Aug 23 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. 
<<< 11044 1726853236.08291: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py <<< 11044 1726853236.08295: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' <<< 11044 1726853236.08330: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py # code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' <<< 11044 1726853236.08334: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py <<< 11044 1726853236.08366: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' <<< 11044 1726853236.08412: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py <<< 11044 1726853236.08416: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' <<< 11044 1726853236.08472: stdout chunk (state=3): >>>import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7f84893abdd0> # /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py <<< 11044 1726853236.08499: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' <<< 11044 1726853236.08502: stdout chunk (state=3): >>>import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7f84893abfe0> <<< 11044 1726853236.08523: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py <<< 11044 1726853236.08549: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' <<< 11044 1726853236.08565: stdout chunk (state=3): >>># 
/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py <<< 11044 1726853236.08603: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' <<< 11044 1726853236.08630: stdout chunk (state=3): >>>import 'itertools' # <<< 11044 1726853236.08666: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py # code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7f84893e3800> <<< 11044 1726853236.08688: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py # code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f84893e3e90> <<< 11044 1726853236.08699: stdout chunk (state=3): >>>import '_collections' # <<< 11044 1726853236.08737: stdout chunk (state=3): >>>import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7f84893c3aa0> <<< 11044 1726853236.08747: stdout chunk (state=3): >>>import '_functools' # <<< 11044 1726853236.08773: stdout chunk (state=3): >>>import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7f84893c11c0> <<< 11044 1726853236.08948: stdout chunk (state=3): >>>import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7f84893a8f80> # /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py # code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' import '_sre' # # /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py <<< 11044 1726853236.08970: stdout chunk (state=3): >>># code object from 
'/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' <<< 11044 1726853236.08987: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py # code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' <<< 11044 1726853236.09018: stdout chunk (state=3): >>>import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8489403770> <<< 11044 1726853236.09046: stdout chunk (state=3): >>>import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8489402390> <<< 11044 1726853236.09058: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py # code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' <<< 11044 1726853236.09084: stdout chunk (state=3): >>>import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7f84893c2090> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8489400ad0> <<< 11044 1726853236.09119: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py <<< 11044 1726853236.09133: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8489438800> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7f84893a8200> <<< 11044 1726853236.09160: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches /usr/lib64/python3.12/struct.py # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' <<< 11044 1726853236.09191: stdout chunk (state=3): >>># extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' <<< 11044 
1726853236.09204: stdout chunk (state=3): >>># extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8489438cb0> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8489438b60> <<< 11044 1726853236.09236: stdout chunk (state=3): >>># extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8489438ef0> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7f84893a6d20> <<< 11044 1726853236.09274: stdout chunk (state=3): >>># /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' <<< 11044 1726853236.09291: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py <<< 11044 1726853236.09325: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' <<< 11044 1726853236.09341: stdout chunk (state=3): >>>import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8489439550> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8489439220> import 'importlib.machinery' # <<< 11044 1726853236.09375: stdout chunk (state=3): >>># /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' <<< 11044 1726853236.09402: stdout chunk (state=3): >>>import 'importlib._abc' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f848943a450> <<< 11044 1726853236.09415: stdout chunk (state=3): >>>import 'importlib.util' # import 'runpy' # <<< 11044 1726853236.09433: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches /usr/lib64/python3.12/shutil.py <<< 11044 1726853236.09466: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' <<< 11044 1726853236.09502: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py # code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8489450680> <<< 11044 1726853236.09505: stdout chunk (state=3): >>>import 'errno' # <<< 11044 1726853236.09540: stdout chunk (state=3): >>># extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8489451d60> <<< 11044 1726853236.09560: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py <<< 11044 1726853236.09607: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py # code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' <<< 11044 1726853236.09611: stdout chunk (state=3): >>>import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8489452c00> <<< 11044 1726853236.09661: stdout chunk (state=3): >>># extension module '_bz2' loaded from 
'/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8489453260> <<< 11044 1726853236.09668: stdout chunk (state=3): >>>import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8489452150> <<< 11044 1726853236.09695: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py # code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' <<< 11044 1726853236.09735: stdout chunk (state=3): >>># extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' <<< 11044 1726853236.09750: stdout chunk (state=3): >>># extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8489453ce0> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8489453410> <<< 11044 1726853236.09798: stdout chunk (state=3): >>>import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f848943a4b0> <<< 11044 1726853236.09816: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py <<< 11044 1726853236.09861: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' <<< 11044 1726853236.09868: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py # code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' <<< 11044 1726853236.09947: stdout chunk (state=3): >>># extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # 
extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8489153bc0> <<< 11044 1726853236.09994: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py # code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' <<< 11044 1726853236.10012: stdout chunk (state=3): >>># extension module '_bisect' loaded from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f848917c6b0> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7f848917c410> <<< 11044 1726853236.10015: stdout chunk (state=3): >>># extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f848917c6e0> <<< 11044 1726853236.10017: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py # code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' <<< 11044 1726853236.10081: stdout chunk (state=3): >>># extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' <<< 11044 1726853236.10208: stdout chunk (state=3): >>># extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f848917d010> <<< 11044 1726853236.10422: 
stdout chunk (state=3): >>># extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f848917d9d0> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f848917c8c0> <<< 11044 1726853236.10647: stdout chunk (state=3): >>>import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8489151d60> # /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py # code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py # code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7f848917edb0> import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7f848917daf0> <<< 11044 1726853236.10680: stdout chunk (state=3): >>>import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f848943aba0> # /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py # code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py # code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7f84891ab140> <<< 11044 1726853236.10683: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py <<< 11044 1726853236.10695: stdout 
chunk (state=3): >>># code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' <<< 11044 1726853236.10708: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py <<< 11044 1726853236.10728: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' <<< 11044 1726853236.10768: stdout chunk (state=3): >>>import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f84891cb500> <<< 11044 1726853236.10789: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py <<< 11044 1726853236.10829: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' <<< 11044 1726853236.10892: stdout chunk (state=3): >>>import 'ntpath' # <<< 11044 1726853236.10930: stdout chunk (state=3): >>># /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f848922c290> # /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py <<< 11044 1726853236.10952: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' <<< 11044 1726853236.10980: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py <<< 11044 1726853236.11013: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' <<< 11044 1726853236.11099: stdout chunk (state=3): >>>import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7f848922e9f0> <<< 11044 1726853236.11168: stdout 
chunk (state=3): >>>import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f848922c3b0> <<< 11044 1726853236.11203: stdout chunk (state=3): >>>import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f84891f12b0> <<< 11044 1726853236.11247: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8488b253a0> <<< 11044 1726853236.11262: stdout chunk (state=3): >>>import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7f84891ca300> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f848917fd10> <<< 11044 1726853236.11446: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/cp437.pyc' <<< 11044 1726853236.11459: stdout chunk (state=3): >>>import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7f84891ca660> <<< 11044 1726853236.11734: stdout chunk (state=3): >>># zipimport: found 103 names in '/tmp/ansible_ansible.legacy.setup_payload_j0dwghel/ansible_ansible.legacy.setup_payload.zip' # zipimport: zlib available <<< 11044 1726853236.11867: stdout chunk (state=3): >>># zipimport: zlib available <<< 11044 1726853236.11897: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py # code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' <<< 11044 1726853236.11948: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py <<< 11044 1726853236.12030: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' <<< 11044 1726853236.12066: stdout chunk 
(state=3): >>># /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py # code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8488b8b0b0> <<< 11044 1726853236.12069: stdout chunk (state=3): >>>import '_typing' # <<< 11044 1726853236.12257: stdout chunk (state=3): >>>import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8488b69fa0> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8488b69100> # zipimport: zlib available <<< 11044 1726853236.12300: stdout chunk (state=3): >>>import 'ansible' # <<< 11044 1726853236.12319: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 11044 1726853236.12353: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils' # <<< 11044 1726853236.12458: stdout chunk (state=3): >>># zipimport: zlib available <<< 11044 1726853236.13780: stdout chunk (state=3): >>># zipimport: zlib available <<< 11044 1726853236.14935: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py # code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8488b88f50> <<< 11044 1726853236.14970: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py # code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' <<< 11044 1726853236.15002: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' # 
/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' <<< 11044 1726853236.15226: stdout chunk (state=3): >>># extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8488bba960> import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8488bba720> import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8488bba030> # /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8488bba480> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8488b8bad0> import 'atexit' # <<< 11044 1726853236.15241: stdout chunk (state=3): >>># extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8488bbb6b0> <<< 11044 1726853236.15268: stdout chunk (state=3): >>># extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8488bbb800> # /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py <<< 11044 
1726853236.15332: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' <<< 11044 1726853236.15335: stdout chunk (state=3): >>>import '_locale' # <<< 11044 1726853236.15379: stdout chunk (state=3): >>>import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8488bbbd10> <<< 11044 1726853236.15397: stdout chunk (state=3): >>>import 'pwd' # <<< 11044 1726853236.15431: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py # code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' <<< 11044 1726853236.15477: stdout chunk (state=3): >>>import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8488a25a90> <<< 11044 1726853236.15648: stdout chunk (state=3): >>># extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8488a276b0> # /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py # code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' <<< 11044 1726853236.15751: stdout chunk (state=3): >>>import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8488a27f80> # /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py # code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8488a291f0> # /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py # code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' # 
/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py # code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' <<< 11044 1726853236.15777: stdout chunk (state=3): >>>import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8488a2bce0> <<< 11044 1726853236.16010: stdout chunk (state=3): >>># extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f84893a6cc0> import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8488a29fa0> # /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py # code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py # code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py <<< 11044 1726853236.16095: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py # code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8488a33d40> import '_tokenize' # <<< 11044 1726853236.16157: stdout chunk (state=3): >>>import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8488a32810> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8488a32570> <<< 11044 1726853236.16184: 
stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py # code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' <<< 11044 1726853236.16336: stdout chunk (state=3): >>>import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8488a32ae0> import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8488a2a4b0> # extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8488a77fb0> <<< 11044 1726853236.16359: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8488a78080> <<< 11044 1726853236.16456: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' <<< 11044 1726853236.16582: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' # extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7f8488a79b50> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8488a79910> # /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py <<< 11044 1726853236.16585: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' <<< 11044 1726853236.16613: stdout chunk (state=3): >>># extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8488a7c0e0> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8488a7a240> # /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py <<< 11044 1726853236.16661: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' <<< 11044 1726853236.16674: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py # code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' import '_string' # <<< 11044 1726853236.16705: stdout chunk (state=3): >>>import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8488a7f890> <<< 11044 1726853236.17276: stdout chunk (state=3): >>>import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8488a7c260> # extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7f8488a808f0> # extension module 'systemd._reader' loaded from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8488a80a70> # extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8488a80a10> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8488a78230> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py # code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' # extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' # extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f84889081d0> # extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' <<< 11044 1726853236.17281: stdout chunk (state=3): >>>import 'array' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7f8488909490> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8488a82960> <<< 11044 1726853236.17352: stdout chunk (state=3): >>># extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8488a83d10> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8488a825d0> <<< 11044 1726853236.17356: stdout chunk (state=3): >>># zipimport: zlib available <<< 11044 1726853236.17358: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.compat' # <<< 11044 1726853236.17377: stdout chunk (state=3): >>># zipimport: zlib available <<< 11044 1726853236.17460: stdout chunk (state=3): >>># zipimport: zlib available <<< 11044 1726853236.17632: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text' # # zipimport: zlib available <<< 11044 1726853236.17726: stdout chunk (state=3): >>># zipimport: zlib available <<< 11044 1726853236.17888: stdout chunk (state=3): >>># zipimport: zlib available <<< 11044 1726853236.18394: stdout chunk (state=3): >>># zipimport: zlib available <<< 11044 1726853236.18936: stdout chunk (state=3): >>>import 'ansible.module_utils.six' # import 'ansible.module_utils.six.moves' # <<< 11044 1726853236.18962: stdout chunk (state=3): >>>import 'ansible.module_utils.six.moves.collections_abc' # import 'ansible.module_utils.common.text.converters' # <<< 11044 1726853236.18994: stdout chunk (state=3): >>># 
/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py <<< 11044 1726853236.19006: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' <<< 11044 1726853236.19052: stdout chunk (state=3): >>># extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f84889114c0> <<< 11044 1726853236.19388: stdout chunk (state=3): >>># /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8488912180> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7f84889095b0> import 'ansible.module_utils.compat.selinux' # # zipimport: zlib available <<< 11044 1726853236.19392: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils._text' # # zipimport: zlib available <<< 11044 1726853236.19488: stdout chunk (state=3): >>># zipimport: zlib available <<< 11044 1726853236.19567: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py # code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' <<< 11044 1726853236.19593: stdout chunk (state=3): >>>import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f84889121b0> # zipimport: zlib available <<< 11044 1726853236.20039: stdout chunk (state=3): >>># zipimport: zlib available <<< 11044 1726853236.20559: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 11044 
1726853236.20678: stdout chunk (state=3): >>>import 'ansible.module_utils.common.collections' # # zipimport: zlib available # zipimport: zlib available <<< 11044 1726853236.20722: stdout chunk (state=3): >>>import 'ansible.module_utils.common.warnings' # <<< 11044 1726853236.20747: stdout chunk (state=3): >>># zipimport: zlib available <<< 11044 1726853236.20796: stdout chunk (state=3): >>># zipimport: zlib available <<< 11044 1726853236.20917: stdout chunk (state=3): >>>import 'ansible.module_utils.errors' # # zipimport: zlib available <<< 11044 1726853236.20920: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.parsing' # <<< 11044 1726853236.21027: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing.convert_bool' # # zipimport: zlib available <<< 11044 1726853236.21243: stdout chunk (state=3): >>># zipimport: zlib available <<< 11044 1726853236.21543: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py <<< 11044 1726853236.21911: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' import '_ast' # import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7f84889132c0> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text.formatters' # import 'ansible.module_utils.common.validation' # import 'ansible.module_utils.common.parameters' # import 'ansible.module_utils.common.arg_spec' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.locale' # # zipimport: zlib available <<< 11044 1726853236.21932: stdout chunk (state=3): >>># zipimport: zlib available <<< 11044 1726853236.21979: stdout chunk (state=3): >>># zipimport: zlib available <<< 11044 1726853236.22036: stdout chunk (state=3): >>># zipimport: zlib available <<< 11044 1726853236.22106: stdout chunk 
(state=3): >>># /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py <<< 11044 1726853236.22143: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' <<< 11044 1726853236.22228: stdout chunk (state=3): >>># extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' <<< 11044 1726853236.22333: stdout chunk (state=3): >>># extension module 'selinux._selinux' executed from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f848891dcd0> import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7f848891b080> import 'ansible.module_utils.common.file' # import 'ansible.module_utils.common.process' # # zipimport: zlib available <<< 11044 1726853236.22377: stdout chunk (state=3): >>># zipimport: zlib available <<< 11044 1726853236.22437: stdout chunk (state=3): >>># zipimport: zlib available <<< 11044 1726853236.22465: stdout chunk (state=3): >>># zipimport: zlib available <<< 11044 1726853236.22509: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' <<< 11044 1726853236.22546: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/distro.py <<< 11044 1726853236.22583: stdout chunk (state=3): >>># code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches 
/usr/lib64/python3.12/argparse.py <<< 11044 1726853236.22668: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py <<< 11044 1726853236.22699: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' <<< 11044 1726853236.22734: stdout chunk (state=3): >>>import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8488a066c0> <<< 11044 1726853236.22859: stdout chunk (state=3): >>>import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8488afe390> <<< 11044 1726853236.22879: stdout chunk (state=3): >>>import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f848891ddf0> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f84889107d0> # destroy ansible.module_utils.distro import 'ansible.module_utils.distro' # # zipimport: zlib available <<< 11044 1726853236.23008: stdout chunk (state=3): >>># zipimport: zlib available <<< 11044 1726853236.23011: stdout chunk (state=3): >>>import 'ansible.module_utils.common._utils' # import 'ansible.module_utils.common.sys_info' # <<< 11044 1726853236.23026: stdout chunk (state=3): >>>import 'ansible.module_utils.basic' # # zipimport: zlib available # zipimport: zlib available import 'ansible.modules' # <<< 11044 1726853236.23041: stdout chunk (state=3): >>># zipimport: zlib available <<< 11044 1726853236.23093: stdout chunk (state=3): >>># zipimport: zlib available <<< 11044 1726853236.23359: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available <<< 11044 1726853236.23384: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.namespace' # # zipimport: zlib available <<< 11044 1726853236.23436: 
stdout chunk (state=3): >>># zipimport: zlib available <<< 11044 1726853236.23523: stdout chunk (state=3): >>># zipimport: zlib available <<< 11044 1726853236.23648: stdout chunk (state=3): >>># zipimport: zlib available <<< 11044 1726853236.23651: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.typing' # # zipimport: zlib available <<< 11044 1726853236.23748: stdout chunk (state=3): >>># zipimport: zlib available <<< 11044 1726853236.23915: stdout chunk (state=3): >>># zipimport: zlib available <<< 11044 1726853236.23953: stdout chunk (state=3): >>># zipimport: zlib available <<< 11044 1726853236.24009: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/__init__.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc' <<< 11044 1726853236.24036: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/context.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc' <<< 11044 1726853236.24069: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/process.py <<< 11044 1726853236.24196: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc' <<< 11044 1726853236.24227: stdout chunk (state=3): >>>import 'multiprocessing.process' # <_frozen_importlib_external.SourceFileLoader object at 0x7f84889b1d90> # /usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/reduction.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc matches /usr/lib64/python3.12/pickle.py 
# code object from '/usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc matches /usr/lib64/python3.12/_compat_pickle.py <<< 11044 1726853236.24241: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc' import '_compat_pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f848857fc20> <<< 11044 1726853236.24284: stdout chunk (state=3): >>># extension module '_pickle' loaded from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' # extension module '_pickle' executed from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' <<< 11044 1726853236.24458: stdout chunk (state=3): >>>import '_pickle' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f848857ff80> <<< 11044 1726853236.24482: stdout chunk (state=3): >>>import 'pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f84889b3350> import 'multiprocessing.reduction' # <_frozen_importlib_external.SourceFileLoader object at 0x7f84889b28d0> import 'multiprocessing.context' # <_frozen_importlib_external.SourceFileLoader object at 0x7f84889b0560> import 'multiprocessing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f84889b1dc0> # /usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/pool.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc' <<< 11044 1726853236.24511: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc matches /usr/lib64/python3.12/queue.py # code object from '/usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc' <<< 11044 1726853236.24550: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc matches /usr/lib64/python3.12/heapq.py # code object from 
'/usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc' <<< 11044 1726853236.24594: stdout chunk (state=3): >>># extension module '_heapq' loaded from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' # extension module '_heapq' executed from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8488596fc0> import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8488596870> # extension module '_queue' loaded from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' <<< 11044 1726853236.24733: stdout chunk (state=3): >>># extension module '_queue' executed from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' import '_queue' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8488596a50> import 'queue' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8488595ca0> # /usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/util.py <<< 11044 1726853236.24763: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc' import 'multiprocessing.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8488597170> # /usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/connection.py <<< 11044 1726853236.24790: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc' <<< 11044 1726853236.24892: stdout chunk (state=3): >>># extension module '_multiprocessing' loaded from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' # extension module '_multiprocessing' executed from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' import 
'_multiprocessing' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f84885edc70> import 'multiprocessing.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8488597c50> import 'multiprocessing.pool' # <_frozen_importlib_external.SourceFileLoader object at 0x7f84889b01a0> import 'ansible.module_utils.facts.timeout' # <<< 11044 1726853236.24970: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.collector' # <<< 11044 1726853236.24976: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 11044 1726853236.24980: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.other' # <<< 11044 1726853236.25057: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 11044 1726853236.25061: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.other.facter' # <<< 11044 1726853236.25077: stdout chunk (state=3): >>># zipimport: zlib available <<< 11044 1726853236.25113: stdout chunk (state=3): >>># zipimport: zlib available <<< 11044 1726853236.25181: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.other.ohai' # <<< 11044 1726853236.25227: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system' # <<< 11044 1726853236.25309: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 11044 1726853236.25314: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.apparmor' # # zipimport: zlib available <<< 11044 1726853236.25364: stdout chunk (state=3): >>># zipimport: zlib available <<< 11044 1726853236.25378: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.caps' # # zipimport: zlib available <<< 11044 1726853236.25577: stdout chunk (state=3): >>># zipimport: zlib available <<< 11044 1726853236.25580: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.chroot' # <<< 11044 1726853236.25582: stdout 
chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available # zipimport: zlib available <<< 11044 1726853236.25633: stdout chunk (state=3): >>># zipimport: zlib available <<< 11044 1726853236.25686: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.utils' # <<< 11044 1726853236.25938: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.cmdline' # # zipimport: zlib available <<< 11044 1726853236.26174: stdout chunk (state=3): >>># zipimport: zlib available <<< 11044 1726853236.26610: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.distribution' # # zipimport: zlib available <<< 11044 1726853236.26654: stdout chunk (state=3): >>># zipimport: zlib available <<< 11044 1726853236.26720: stdout chunk (state=3): >>># zipimport: zlib available <<< 11044 1726853236.26753: stdout chunk (state=3): >>># zipimport: zlib available <<< 11044 1726853236.26784: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.datetime' # import 'ansible.module_utils.facts.system.date_time' # # zipimport: zlib available <<< 11044 1726853236.26827: stdout chunk (state=3): >>># zipimport: zlib available <<< 11044 1726853236.26863: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.env' # # zipimport: zlib available <<< 11044 1726853236.26933: stdout chunk (state=3): >>># zipimport: zlib available <<< 11044 1726853236.26975: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.dns' # # zipimport: zlib available <<< 11044 1726853236.27006: stdout chunk (state=3): >>># zipimport: zlib available <<< 11044 1726853236.27041: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.fips' # # zipimport: zlib available <<< 11044 1726853236.27074: stdout chunk (state=3): >>># zipimport: zlib available <<< 11044 1726853236.27103: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.loadavg' # # zipimport: zlib available <<< 11044 1726853236.27188: stdout chunk (state=3): 
>>># zipimport: zlib available <<< 11044 1726853236.27274: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/glob.py <<< 11044 1726853236.27383: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc' import 'glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f84885efce0> <<< 11044 1726853236.27387: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc matches /usr/lib64/python3.12/configparser.py # code object from '/usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc' <<< 11044 1726853236.27462: stdout chunk (state=3): >>>import 'configparser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f84885ee8d0> <<< 11044 1726853236.27495: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.local' # # zipimport: zlib available <<< 11044 1726853236.27546: stdout chunk (state=3): >>># zipimport: zlib available <<< 11044 1726853236.27784: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.lsb' # # zipimport: zlib available <<< 11044 1726853236.27787: stdout chunk (state=3): >>># zipimport: zlib available <<< 11044 1726853236.28085: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.pkg_mgr' # # zipimport: zlib available # zipimport: zlib available <<< 11044 1726853236.28088: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.platform' # <<< 11044 1726853236.28091: stdout chunk (state=3): >>># zipimport: zlib available <<< 11044 1726853236.28128: stdout chunk (state=3): >>># zipimport: zlib available # /usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc matches /usr/lib64/python3.12/ssl.py # code object from '/usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc' <<< 11044 1726853236.28134: stdout chunk (state=3): >>># extension module '_ssl' loaded from 
'/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' <<< 11044 1726853236.28234: stdout chunk (state=3): >>># extension module '_ssl' executed from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' import '_ssl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f848861e090> <<< 11044 1726853236.28372: stdout chunk (state=3): >>>import 'ssl' # <_frozen_importlib_external.SourceFileLoader object at 0x7f848860ef60> <<< 11044 1726853236.28381: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.python' # <<< 11044 1726853236.28387: stdout chunk (state=3): >>># zipimport: zlib available <<< 11044 1726853236.28485: stdout chunk (state=3): >>># zipimport: zlib available <<< 11044 1726853236.28533: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.selinux' # # zipimport: zlib available <<< 11044 1726853236.28674: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 11044 1726853236.28885: stdout chunk (state=3): >>># zipimport: zlib available <<< 11044 1726853236.28920: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.version' # import 'ansible.module_utils.facts.system.service_mgr' # <<< 11044 1726853236.28927: stdout chunk (state=3): >>># zipimport: zlib available <<< 11044 1726853236.28965: stdout chunk (state=3): >>># zipimport: zlib available <<< 11044 1726853236.29008: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.ssh_pub_keys' # # zipimport: zlib available <<< 11044 1726853236.29057: stdout chunk (state=3): >>># zipimport: zlib available <<< 11044 1726853236.29104: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc matches /usr/lib64/python3.12/getpass.py <<< 11044 1726853236.29120: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc' <<< 11044 1726853236.29145: stdout chunk (state=3): >>># extension module 'termios' 
loaded from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' <<< 11044 1726853236.29149: stdout chunk (state=3): >>># extension module 'termios' executed from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' import 'termios' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8488635ca0> import 'getpass' # <_frozen_importlib_external.SourceFileLoader object at 0x7f848860f0e0> <<< 11044 1726853236.29436: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.user' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.base' # # zipimport: zlib available <<< 11044 1726853236.29455: stdout chunk (state=3): >>># zipimport: zlib available <<< 11044 1726853236.29591: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.aix' # <<< 11044 1726853236.29598: stdout chunk (state=3): >>># zipimport: zlib available <<< 11044 1726853236.29787: stdout chunk (state=3): >>># zipimport: zlib available <<< 11044 1726853236.29793: stdout chunk (state=3): >>># zipimport: zlib available <<< 11044 1726853236.29880: stdout chunk (state=3): >>># zipimport: zlib available <<< 11044 1726853236.29884: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.sysctl' # import 'ansible.module_utils.facts.hardware.darwin' # <<< 11044 1726853236.30062: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available # zipimport: zlib available <<< 11044 1726853236.30069: stdout chunk (state=3): >>># zipimport: zlib available <<< 11044 1726853236.30496: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.freebsd' # import 'ansible.module_utils.facts.hardware.dragonfly' # # zipimport: zlib available # zipimport: zlib available <<< 11044 1726853236.30499: stdout chunk (state=3): >>>import 
'ansible.module_utils.facts.hardware.hpux' # <<< 11044 1726853236.30502: stdout chunk (state=3): >>># zipimport: zlib available <<< 11044 1726853236.30534: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 11044 1726853236.31574: stdout chunk (state=3): >>># zipimport: zlib available <<< 11044 1726853236.31592: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.linux' # import 'ansible.module_utils.facts.hardware.hurd' # <<< 11044 1726853236.31598: stdout chunk (state=3): >>># zipimport: zlib available <<< 11044 1726853236.31778: stdout chunk (state=3): >>># zipimport: zlib available <<< 11044 1726853236.31801: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.netbsd' # # zipimport: zlib available <<< 11044 1726853236.31905: stdout chunk (state=3): >>># zipimport: zlib available <<< 11044 1726853236.32002: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.openbsd' # <<< 11044 1726853236.32008: stdout chunk (state=3): >>># zipimport: zlib available <<< 11044 1726853236.32284: stdout chunk (state=3): >>># zipimport: zlib available <<< 11044 1726853236.32309: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.sunos' # <<< 11044 1726853236.32328: stdout chunk (state=3): >>># zipimport: zlib available <<< 11044 1726853236.32332: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.network' # <<< 11044 1726853236.32357: stdout chunk (state=3): >>># zipimport: zlib available <<< 11044 1726853236.32391: stdout chunk (state=3): >>># zipimport: zlib available <<< 11044 1726853236.32461: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.base' # # zipimport: zlib available<<< 11044 1726853236.32464: stdout chunk (state=3): >>> <<< 11044 1726853236.32683: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 11044 1726853236.32877: stdout chunk (state=3): >>># zipimport: 
zlib available <<< 11044 1726853236.33035: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.generic_bsd' # import 'ansible.module_utils.facts.network.aix' # <<< 11044 1726853236.33043: stdout chunk (state=3): >>># zipimport: zlib available <<< 11044 1726853236.33083: stdout chunk (state=3): >>># zipimport: zlib available <<< 11044 1726853236.33120: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.darwin' # <<< 11044 1726853236.33126: stdout chunk (state=3): >>># zipimport: zlib available <<< 11044 1726853236.33153: stdout chunk (state=3): >>># zipimport: zlib available <<< 11044 1726853236.33179: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.dragonfly' # <<< 11044 1726853236.33185: stdout chunk (state=3): >>># zipimport: zlib available <<< 11044 1726853236.33586: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.network.fc_wwn' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.freebsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.hpux' # # zipimport: zlib available # zipimport: zlib available <<< 11044 1726853236.33612: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.hurd' # <<< 11044 1726853236.33619: stdout chunk (state=3): >>># zipimport: zlib available <<< 11044 1726853236.33880: stdout chunk (state=3): >>># zipimport: zlib available <<< 11044 1726853236.34135: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.linux' # # zipimport: zlib available<<< 11044 1726853236.34141: stdout chunk (state=3): >>> <<< 11044 1726853236.34259: stdout chunk (state=3): >>># zipimport: zlib available <<< 11044 1726853236.34266: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.iscsi' # # zipimport: zlib available <<< 11044 1726853236.34297: stdout chunk (state=3): >>># zipimport: zlib available <<< 11044 
1726853236.34367: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.nvme' # # zipimport: zlib available <<< 11044 1726853236.34375: stdout chunk (state=3): >>># zipimport: zlib available <<< 11044 1726853236.34406: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.netbsd' # <<< 11044 1726853236.34412: stdout chunk (state=3): >>># zipimport: zlib available <<< 11044 1726853236.34446: stdout chunk (state=3): >>># zipimport: zlib available <<< 11044 1726853236.34903: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.sunos' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.base' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available <<< 11044 1726853236.34910: stdout chunk (state=3): >>># zipimport: zlib available <<< 11044 1726853236.34988: stdout chunk (state=3): >>># zipimport: zlib available <<< 11044 1726853236.35068: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.sysctl' # <<< 11044 1726853236.35077: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.freebsd' # import 'ansible.module_utils.facts.virtual.dragonfly' # <<< 11044 1726853236.35139: stdout chunk (state=3): >>># zipimport: zlib available <<< 11044 1726853236.35169: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.virtual.hpux' # <<< 11044 1726853236.35177: stdout chunk (state=3): >>># zipimport: zlib available <<< 11044 1726853236.35374: stdout chunk (state=3): >>># zipimport: zlib available <<< 11044 1726853236.35553: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.linux' # <<< 11044 1726853236.35583: stdout chunk (state=3): 
>>># zipimport: zlib available <<< 11044 1726853236.35784: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.virtual.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.openbsd' # # zipimport: zlib available <<< 11044 1726853236.35908: stdout chunk (state=3): >>># zipimport: zlib available <<< 11044 1726853236.35932: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.sunos' # import 'ansible.module_utils.facts.default_collectors' # # zipimport: zlib available <<< 11044 1726853236.36020: stdout chunk (state=3): >>># zipimport: zlib available <<< 11044 1726853236.36107: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.ansible_collector' # <<< 11044 1726853236.36116: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.compat' # import 'ansible.module_utils.facts' # <<< 11044 1726853236.36201: stdout chunk (state=3): >>># zipimport: zlib available <<< 11044 1726853236.36530: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc matches /usr/lib64/python3.12/encodings/idna.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc' <<< 11044 1726853236.36535: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc matches /usr/lib64/python3.12/stringprep.py <<< 11044 1726853236.36569: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc' <<< 11044 1726853236.36578: stdout chunk (state=3): >>># extension module 'unicodedata' loaded from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' # extension module 'unicodedata' executed from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' import 'unicodedata' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f84883ca720> <<< 11044 1726853236.36641: stdout chunk 
(state=3): >>>import 'stringprep' # <_frozen_importlib_external.SourceFileLoader object at 0x7f84883cb110> import 'encodings.idna' # <_frozen_importlib_external.SourceFileLoader object at 0x7f84883c1f70> <<< 11044 1726853236.52145: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/queues.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/queues.py <<< 11044 1726853236.52150: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/queues.cpython-312.pyc' import 'multiprocessing.queues' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8488411250> <<< 11044 1726853236.52216: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/synchronize.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/synchronize.py <<< 11044 1726853236.52269: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/synchronize.cpython-312.pyc' import 'multiprocessing.synchronize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8488411f70> <<< 11044 1726853236.52276: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/dummy/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/dummy/__init__.py <<< 11044 1726853236.52342: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/dummy/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/dummy/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/dummy/connection.py # code object from '/usr/lib64/python3.12/multiprocessing/dummy/__pycache__/connection.cpython-312.pyc' <<< 11044 1726853236.52350: stdout chunk (state=3): >>>import 'multiprocessing.dummy.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7f848845c710> import 'multiprocessing.dummy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f848845c200> <<< 
11044 1726853236.52599: stdout chunk (state=3): >>>PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame <<< 11044 1726853236.72802: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_apparmor": {"status": "disabled"}, "ansible_iscsi_iqn": "", "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCXYnrsBaUY4i0/t1QUWoZkFPnFbGncbkmF01/zUZNuwldCwqYoDxpR2K8ARraEuK9oVLyYO0alCszdGP42db6R4xfRCOhN3faeZXsneupsJk4LLpIBkq0uIokeAtcL1tPOUQQzfsQqzZzp4BmJCVrwmUW5ADnzqCgvB3gsruyTQUrEUJ9MtB5zdaQm5MXuipjeZQThTjYCB2aXv/qTdzfKAwds3CoSZ6HA5GNdi6tahsy3CRIn6VtVkvwrqjJcwo+RaRQzjh+C9AUoH2YQmfLbvog62MsnLk/5OPq5HhxO81pm/TJDsI4LXwLh1VviMOWzVvIaXuKwdmYAdgX1NU561bBzeYzi55qBKo4TcMmnOXiV+Es7dDrKjwwpQKsv5tjSqVkcO6ek3I6SI38DXFOBLZtqXOOLsO12iOReYJUWe/+cgBrz12kDCPmaHFzFFZ3+N0GQ/WiYcgqiUItIhb3xJTbPqls0czPCpCzKo57GyTmv17fpfGhBfRoGX/H1zYs=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBDOnt+7F+RlMaCRRXs8YyuhiHP1FmeDlj4rNB/K2mg1iP9loXXc/XjJ083xMBDu7m7uYLGB+dnmj299Y+RcAQpE=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIKmxoIJtw8UORlc+o+Q7Pks5ERSyhMLnl+Oo8W221WGj", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", 
"ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-45-153.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-45-153", "ansible_nodename": "ip-10-31-45-153.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec26b9e88796a7cb9ebdc2656ce384f6", "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:66b8343d-0be9-40f0-836f-5213a2c1e120", "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_fips": false, "ansible_env": {"PYTHONVERBOSE": "1", "SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.46.199 44238 10.31.45.153 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.46.199 44238 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_lsb": {}, "ansible_is_chroot": false, "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", 
"ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_fibre_channel_wwn": [], "ansible_local": {}, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_nu<<< 11044 1726853236.72849: stdout chunk (state=3): >>>mber": "5", "weeknumber": "38", "day": "20", "hour": "13", "minute": "27", "second": "16", "epoch": "1726853236", "epoch_int": "1726853236", "date": "2024-09-20", "time": "13:27:16", "iso8601_micro": "2024-09-20T17:27:16.376009Z", "iso8601": "2024-09-20T17:27:16Z", "iso8601_basic": "20240920T132716376009", "iso8601_basic_short": "20240920T132716", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_interfaces": ["eth0", "lo"], "ansible_eth0": {"device": "eth0", "macaddress": "02:3a:e7:40:bc:9f", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.45.153", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22"}, "ipv6": [{"address": "fe80::3a:e7ff:fe40:bc9f", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", 
"tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": 
"off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": 
"off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.44.1", "interface": "eth0", "address": "10.31.45.153", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22", "macaddress": "02:3a:e7:40:bc:9f", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.45.153"], "ansible_all_ipv6_addresses": ["fe80::3a:e7ff:fe40:bc9f"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.45.153", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::3a:e7ff:fe40:bc9f"]}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 2959, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, 
"ansible_memory_mb": {"real": {"total": 3531, "used": 572, "free": 2959}, "nocache": {"free": 3292, "used": 239}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec26b9e8-8796-a7cb-9ebd-c2656ce384f6", "ansible_product_uuid": "ec26b9e8-8796-a7cb-9ebd-c2656ce384f6", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["4853857d-d3e2-44a6-8739-3837607c97e1"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["4853857d-d3e2-44a6-8739-3837607c97e1"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 402, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": 
"rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261793972224, "block_size": 4096, "block_total": 65519099, "block_available": 63914544, "block_used": 1604555, "inode_total": 131070960, "inode_available": 131029088, "inode_used": 41872, "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1"}], "ansible_pkg_mgr": "dnf", "ansible_loadavg": {"1m": 0.4326171875, "5m": 0.22265625, "15m": 0.09228515625}, "ansible_service_mgr": "systemd", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 11044 1726853236.73462: stdout chunk (state=3): >>># clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc <<< 11044 1726853236.73515: stdout chunk (state=3): >>># clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing 
genericpath # cleanup[2] removing posixpath <<< 11044 1726853236.73536: stdout chunk (state=3): >>># cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 <<< 11044 1726853236.73574: stdout chunk (state=3): >>># cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # 
destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # 
cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket <<< 11044 1726853236.73588: stdout chunk (state=3): >>># cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon <<< 11044 1726853236.73620: stdout chunk (state=3): >>># cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy 
ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro <<< 11044 1726853236.73669: stdout chunk (state=3): >>># cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _heapq # cleanup[2] removing heapq # destroy heapq # cleanup[2] removing _queue # cleanup[2] removing queue <<< 11044 1726853236.73686: stdout chunk (state=3): >>># cleanup[2] removing multiprocessing.util # cleanup[2] removing 
_multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys 
# cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing 
ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly <<< 11044 1726853236.73726: stdout chunk (state=3): >>># cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # 
destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn<<< 11044 1726853236.73740: stdout chunk (state=3): >>> # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy 
ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna # cleanup[2] removing multiprocessing.queues # cleanup[2] removing multiprocessing.synchronize # cleanup[2] removing multiprocessing.dummy.connection # cleanup[2] removing multiprocessing.dummy <<< 11044 1726853236.74064: stdout chunk (state=3): >>># destroy _sitebuiltins <<< 11044 1726853236.74105: stdout chunk (state=3): >>># destroy importlib.machinery # destroy importlib._abc # destroy importlib.util <<< 11044 1726853236.74116: stdout chunk (state=3): >>># destroy _bz2 # destroy _compression # destroy _lzma <<< 11044 1726853236.74146: stdout chunk (state=3): >>># destroy _blake2 # destroy binascii # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy ipaddress <<< 11044 1726853236.74186: stdout chunk (state=3): >>># destroy ntpath <<< 11044 1726853236.74217: stdout chunk (state=3): >>># destroy importlib <<< 11044 1726853236.74245: stdout chunk (state=3): >>># destroy zipimport # destroy __main__ # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json <<< 11044 1726853236.74261: stdout chunk (state=3): >>># destroy grp # destroy encodings # destroy _locale # destroy locale # destroy select # destroy _signal <<< 11044 1726853236.74300: stdout chunk (state=3): >>># destroy _posixsubprocess # destroy syslog # destroy uuid <<< 11044 1726853236.74303: stdout chunk (state=3): >>># destroy selinux <<< 11044 1726853236.74333: stdout chunk 
(state=3): >>># destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy logging <<< 11044 1726853236.74364: stdout chunk (state=3): >>># destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector<<< 11044 1726853236.74382: stdout chunk (state=3): >>> # destroy multiprocessing # destroy multiprocessing.queues # destroy multiprocessing.synchronize # destroy multiprocessing.dummy # destroy multiprocessing.pool # destroy signal # destroy pickle # destroy _compat_pickle # destroy _pickle<<< 11044 1726853236.74415: stdout chunk (state=3): >>> # destroy queue # destroy _heapq # destroy _queue # destroy multiprocessing.reduction # destroy selectors <<< 11044 1726853236.74444: stdout chunk (state=3): >>># destroy shlex # destroy fcntl <<< 11044 1726853236.74465: stdout chunk (state=3): >>># destroy datetime # destroy subprocess # destroy base64 <<< 11044 1726853236.74500: stdout chunk (state=3): >>># destroy _ssl # destroy ansible.module_utils.compat.selinux # destroy getpass # destroy pwd # destroy termios <<< 11044 1726853236.74519: stdout chunk (state=3): >>># destroy json # destroy socket # destroy struct <<< 11044 1726853236.74552: stdout chunk (state=3): >>># destroy glob # destroy fnmatch # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # destroy unicodedata # destroy errno # destroy multiprocessing.connection <<< 11044 1726853236.74584: stdout chunk (state=3): >>># destroy tempfile # destroy multiprocessing.context # destroy multiprocessing.process # destroy multiprocessing.util # destroy _multiprocessing # destroy array # destroy multiprocessing.dummy.connection <<< 11044 1726853236.74616: stdout chunk (state=3): >>># cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping configparser # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes 
# cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback <<< 11044 1726853236.74639: stdout chunk (state=3): >>># destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib <<< 11044 1726853236.74677: stdout chunk (state=3): >>># cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap <<< 11044 1726853236.74738: stdout chunk (state=3): >>># cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc <<< 11044 1726853236.74741: stdout chunk (state=3): >>># cleanup[3] wiping _abc # cleanup[3] wiping 
encodings.utf_8 # cleanup[3] wiping encodings.aliases <<< 11044 1726853236.74765: stdout chunk (state=3): >>># cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime <<< 11044 1726853236.74906: stdout chunk (state=3): >>># destroy sys.monitoring <<< 11044 1726853236.74945: stdout chunk (state=3): >>># destroy _socket # destroy _collections # destroy platform <<< 11044 1726853236.74970: stdout chunk (state=3): >>># destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize <<< 11044 1726853236.74987: stdout chunk (state=3): >>># destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib <<< 11044 1726853236.75019: stdout chunk (state=3): >>># destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves <<< 11044 1726853236.75053: stdout chunk (state=3): >>># destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib <<< 11044 1726853236.75179: stdout chunk (state=3): >>># destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # 
destroy encodings.idna # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time <<< 11044 1726853236.75185: stdout chunk (state=3): >>># destroy _random # destroy _weakref <<< 11044 1726853236.75238: stdout chunk (state=3): >>># destroy _hashlib # destroy _operator # destroy _sre # destroy _string # destroy re # destroy itertools <<< 11044 1726853236.75246: stdout chunk (state=3): >>># destroy _abc # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks <<< 11044 1726853236.76036: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.45.153 closed. <<< 11044 1726853236.76039: stdout chunk (state=3): >>><<< 11044 1726853236.76049: stderr chunk (state=3): >>><<< 11044 1726853236.76299: _low_level_execute_command() done: rc=0, stdout=import _frozen_importlib # frozen import _imp # builtin import '_thread' # import '_warnings' # import '_weakref' # import '_io' # import 'marshal' # import 'posix' # import '_frozen_importlib_external' # # installing zipimport hook import 'time' # import 'zipimport' # # installed zipimport hook # /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' import '_codecs' # import 'codecs' # # /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7f84895bc4d0> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f848958bb00> # 
/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7f84895bea50> import '_signal' # import '_abc' # import 'abc' # import 'io' # import '_stat' # import 'stat' # import '_collections_abc' # import 'genericpath' # import 'posixpath' # import 'os' # import '_sitebuiltins' # Processing user site-packages Processing global site-packages Adding directory: '/usr/lib64/python3.12/site-packages' Adding directory: '/usr/lib/python3.12/site-packages' Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' # /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8_sig.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7f84895cd130> # /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py # code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7f84895cdfa0> import 'site' # Python 3.12.5 (main, Aug 23 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. 
# /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py # code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py # code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py # code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py # code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7f84893abdd0> # /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py # code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7f84893abfe0> # /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py # code object from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py # code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' import 'itertools' # # /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py # code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7f84893e3800> # /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py # code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' import 'reprlib' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f84893e3e90> import '_collections' # import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7f84893c3aa0> import '_functools' # import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7f84893c11c0> import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7f84893a8f80> # /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py # code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' import '_sre' # # /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py # code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py # code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8489403770> import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8489402390> # /usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py # code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7f84893c2090> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8489400ad0> # /usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py # code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8489438800> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7f84893a8200> # /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches 
/usr/lib64/python3.12/struct.py # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' # extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8489438cb0> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8489438b60> # extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8489438ef0> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7f84893a6d20> # /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py # code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8489439550> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8489439220> import 'importlib.machinery' # # /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f848943a450> import 'importlib.util' # import 'runpy' # # /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches 
/usr/lib64/python3.12/shutil.py # code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py # code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8489450680> import 'errno' # # extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8489451d60> # /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py # code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py # code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8489452c00> # extension module '_bz2' loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8489453260> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8489452150> # /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py # code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' # extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7f8489453ce0> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8489453410> import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f848943a4b0> # /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py # code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py # code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' # extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8489153bc0> # /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py # code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' # extension module '_bisect' loaded from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f848917c6b0> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7f848917c410> # extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f848917c6e0> # /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py # code object from 
'/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' # extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' # extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f848917d010> # extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f848917d9d0> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f848917c8c0> import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8489151d60> # /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py # code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py # code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7f848917edb0> import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7f848917daf0> import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f848943aba0> # /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py # code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py # code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' import 'threading' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f84891ab140> # /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py # code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f84891cb500> # /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py # code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' import 'ntpath' # # /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f848922c290> # /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py # code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7f848922e9f0> import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f848922c3b0> import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f84891f12b0> # /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f8488b253a0> import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7f84891ca300> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f848917fd10> # code object from '/usr/lib64/python3.12/encodings/cp437.pyc' import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7f84891ca660> # zipimport: found 103 names in '/tmp/ansible_ansible.legacy.setup_payload_j0dwghel/ansible_ansible.legacy.setup_payload.zip' # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py # code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py # code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py # code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8488b8b0b0> import '_typing' # import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8488b69fa0> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8488b69100> # zipimport: zlib available import 'ansible' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py # code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8488b88f50> # 
/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py # code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' # extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8488bba960> import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8488bba720> import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8488bba030> # /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8488bba480> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8488b8bad0> import 'atexit' # # extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8488bbb6b0> # extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' # extension module 'fcntl' executed from 
'/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8488bbb800> # /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py # code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' import '_locale' # import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8488bbbd10> import 'pwd' # # /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py # code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8488a25a90> # extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8488a276b0> # /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py # code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8488a27f80> # /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py # code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8488a291f0> # /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py # code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py # code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' import 'signal' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f8488a2bce0> # extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f84893a6cc0> import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8488a29fa0> # /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py # code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py # code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py # code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py # code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8488a33d40> import '_tokenize' # import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8488a32810> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8488a32570> # /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py # code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8488a32ae0> import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8488a2a4b0> # extension module 'syslog' loaded from 
'/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8488a77fb0> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8488a78080> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' # extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8488a79b50> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8488a79910> # /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py # code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' # extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8488a7c0e0> import 'uuid' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f8488a7a240> # /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py # code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py # code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' import '_string' # import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8488a7f890> import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8488a7c260> # extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8488a808f0> # extension module 'systemd._reader' loaded from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8488a80a70> # extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8488a80a10> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8488a78230> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches 
/usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py # code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' # extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' # extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f84889081d0> # extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8488909490> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8488a82960> # extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8488a83d10> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8488a825d0> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available 
import 'ansible.module_utils.six' # import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import 'ansible.module_utils.common.text.converters' # # /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' # extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f84889114c0> # /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8488912180> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7f84889095b0> import 'ansible.module_utils.compat.selinux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils._text' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py # code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f84889121b0> # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.collections' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.warnings' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.errors' # # zipimport: zlib available # zipimport: zlib available import 
'ansible.module_utils.parsing' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing.convert_bool' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py # code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' import '_ast' # import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7f84889132c0> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text.formatters' # import 'ansible.module_utils.common.validation' # import 'ansible.module_utils.common.parameters' # import 'ansible.module_utils.common.arg_spec' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.locale' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py # code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' # extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f848891dcd0> import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7f848891b080> import 'ansible.module_utils.common.file' # import 'ansible.module_utils.common.process' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches 
/usr/lib/python3.12/site-packages/distro/__init__.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' # /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/distro.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py # code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py # code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8488a066c0> import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8488afe390> import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f848891ddf0> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f84889107d0> # destroy ansible.module_utils.distro import 'ansible.module_utils.distro' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common._utils' # import 'ansible.module_utils.common.sys_info' # import 'ansible.module_utils.basic' # # zipimport: zlib available # zipimport: zlib available import 'ansible.modules' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.namespace' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.typing' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib 
available # /usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/__init__.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/context.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/process.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc' import 'multiprocessing.process' # <_frozen_importlib_external.SourceFileLoader object at 0x7f84889b1d90> # /usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/reduction.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc matches /usr/lib64/python3.12/pickle.py # code object from '/usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc matches /usr/lib64/python3.12/_compat_pickle.py # code object from '/usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc' import '_compat_pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f848857fc20> # extension module '_pickle' loaded from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' # extension module '_pickle' executed from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' import '_pickle' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f848857ff80> import 'pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f84889b3350> import 'multiprocessing.reduction' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f84889b28d0> import 'multiprocessing.context' # <_frozen_importlib_external.SourceFileLoader object at 0x7f84889b0560> import 'multiprocessing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f84889b1dc0> # /usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/pool.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc matches /usr/lib64/python3.12/queue.py # code object from '/usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc matches /usr/lib64/python3.12/heapq.py # code object from '/usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc' # extension module '_heapq' loaded from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' # extension module '_heapq' executed from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8488596fc0> import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8488596870> # extension module '_queue' loaded from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' # extension module '_queue' executed from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' import '_queue' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8488596a50> import 'queue' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8488595ca0> # /usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/util.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc' import 'multiprocessing.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8488597170> # 
/usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/connection.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc' # extension module '_multiprocessing' loaded from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' # extension module '_multiprocessing' executed from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' import '_multiprocessing' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f84885edc70> import 'multiprocessing.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8488597c50> import 'multiprocessing.pool' # <_frozen_importlib_external.SourceFileLoader object at 0x7f84889b01a0> import 'ansible.module_utils.facts.timeout' # import 'ansible.module_utils.facts.collector' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other.facter' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other.ohai' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.apparmor' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.caps' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.chroot' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.utils' # import 'ansible.module_utils.facts.system.cmdline' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.distribution' # # zipimport: zlib 
available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.datetime' # import 'ansible.module_utils.facts.system.date_time' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.env' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.dns' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.fips' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.loadavg' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/glob.py # code object from '/usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc' import 'glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f84885efce0> # /usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc matches /usr/lib64/python3.12/configparser.py # code object from '/usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc' import 'configparser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f84885ee8d0> import 'ansible.module_utils.facts.system.local' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.lsb' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.pkg_mgr' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.platform' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc matches /usr/lib64/python3.12/ssl.py # code object from '/usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc' # extension module '_ssl' loaded from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' # extension module '_ssl' executed from 
'/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' import '_ssl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f848861e090> import 'ssl' # <_frozen_importlib_external.SourceFileLoader object at 0x7f848860ef60> import 'ansible.module_utils.facts.system.python' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.selinux' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.version' # import 'ansible.module_utils.facts.system.service_mgr' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.ssh_pub_keys' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc matches /usr/lib64/python3.12/getpass.py # code object from '/usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc' # extension module 'termios' loaded from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' # extension module 'termios' executed from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' import 'termios' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8488635ca0> import 'getpass' # <_frozen_importlib_external.SourceFileLoader object at 0x7f848860f0e0> import 'ansible.module_utils.facts.system.user' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.base' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.aix' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.sysctl' # import 'ansible.module_utils.facts.hardware.darwin' # # zipimport: zlib 
available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.freebsd' # import 'ansible.module_utils.facts.hardware.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.hpux' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.linux' # import 'ansible.module_utils.facts.hardware.hurd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.sunos' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.base' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.generic_bsd' # import 'ansible.module_utils.facts.network.aix' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.darwin' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.fc_wwn' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.freebsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.hpux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.hurd' # # zipimport: zlib available # zipimport: zlib available 
import 'ansible.module_utils.facts.network.linux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.iscsi' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.nvme' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.sunos' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.base' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.sysctl' # import 'ansible.module_utils.facts.virtual.freebsd' # import 'ansible.module_utils.facts.virtual.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.hpux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.linux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.sunos' # import 'ansible.module_utils.facts.default_collectors' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.ansible_collector' # import 'ansible.module_utils.facts.compat' # import 'ansible.module_utils.facts' # # zipimport: zlib available # 
/usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc matches /usr/lib64/python3.12/encodings/idna.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc matches /usr/lib64/python3.12/stringprep.py # code object from '/usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc' # extension module 'unicodedata' loaded from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' # extension module 'unicodedata' executed from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' import 'unicodedata' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f84883ca720> import 'stringprep' # <_frozen_importlib_external.SourceFileLoader object at 0x7f84883cb110> import 'encodings.idna' # <_frozen_importlib_external.SourceFileLoader object at 0x7f84883c1f70> # /usr/lib64/python3.12/multiprocessing/__pycache__/queues.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/queues.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/queues.cpython-312.pyc' import 'multiprocessing.queues' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8488411250> # /usr/lib64/python3.12/multiprocessing/__pycache__/synchronize.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/synchronize.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/synchronize.cpython-312.pyc' import 'multiprocessing.synchronize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8488411f70> # /usr/lib64/python3.12/multiprocessing/dummy/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/dummy/__init__.py # code object from '/usr/lib64/python3.12/multiprocessing/dummy/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/dummy/__pycache__/connection.cpython-312.pyc matches 
/usr/lib64/python3.12/multiprocessing/dummy/connection.py # code object from '/usr/lib64/python3.12/multiprocessing/dummy/__pycache__/connection.cpython-312.pyc' import 'multiprocessing.dummy.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7f848845c710> import 'multiprocessing.dummy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f848845c200> PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame {"ansible_facts": {"ansible_apparmor": {"status": "disabled"}, "ansible_iscsi_iqn": "", "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCXYnrsBaUY4i0/t1QUWoZkFPnFbGncbkmF01/zUZNuwldCwqYoDxpR2K8ARraEuK9oVLyYO0alCszdGP42db6R4xfRCOhN3faeZXsneupsJk4LLpIBkq0uIokeAtcL1tPOUQQzfsQqzZzp4BmJCVrwmUW5ADnzqCgvB3gsruyTQUrEUJ9MtB5zdaQm5MXuipjeZQThTjYCB2aXv/qTdzfKAwds3CoSZ6HA5GNdi6tahsy3CRIn6VtVkvwrqjJcwo+RaRQzjh+C9AUoH2YQmfLbvog62MsnLk/5OPq5HhxO81pm/TJDsI4LXwLh1VviMOWzVvIaXuKwdmYAdgX1NU561bBzeYzi55qBKo4TcMmnOXiV+Es7dDrKjwwpQKsv5tjSqVkcO6ek3I6SI38DXFOBLZtqXOOLsO12iOReYJUWe/+cgBrz12kDCPmaHFzFFZ3+N0GQ/WiYcgqiUItIhb3xJTbPqls0czPCpCzKo57GyTmv17fpfGhBfRoGX/H1zYs=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBDOnt+7F+RlMaCRRXs8YyuhiHP1FmeDlj4rNB/K2mg1iP9loXXc/XjJ083xMBDu7m7uYLGB+dnmj299Y+RcAQpE=", "ansible_ssh_host_key_ecdsa_public_keytype": 
"ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIKmxoIJtw8UORlc+o+Q7Pks5ERSyhMLnl+Oo8W221WGj", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-45-153.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-45-153", "ansible_nodename": "ip-10-31-45-153.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec26b9e88796a7cb9ebdc2656ce384f6", "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:66b8343d-0be9-40f0-836f-5213a2c1e120", "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_fips": false, "ansible_env": {"PYTHONVERBOSE": "1", "SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.46.199 44238 10.31.45.153 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.46.199 44238 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", 
"DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_lsb": {}, "ansible_is_chroot": false, "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_fibre_channel_wwn": [], "ansible_local": {}, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "13", "minute": "27", "second": "16", "epoch": "1726853236", "epoch_int": "1726853236", "date": "2024-09-20", "time": "13:27:16", "iso8601_micro": "2024-09-20T17:27:16.376009Z", "iso8601": "2024-09-20T17:27:16Z", "iso8601_basic": "20240920T132716376009", "iso8601_basic_short": "20240920T132716", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_interfaces": ["eth0", "lo"], "ansible_eth0": {"device": "eth0", "macaddress": "02:3a:e7:40:bc:9f", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.45.153", "broadcast": 
"10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22"}, "ipv6": [{"address": "fe80::3a:e7ff:fe40:bc9f", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", 
"esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", 
"tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.44.1", "interface": "eth0", "address": "10.31.45.153", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22", "macaddress": "02:3a:e7:40:bc:9f", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.45.153"], "ansible_all_ipv6_addresses": ["fe80::3a:e7ff:fe40:bc9f"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.45.153", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::3a:e7ff:fe40:bc9f"]}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 
1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 2959, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 572, "free": 2959}, "nocache": {"free": 3292, "used": 239}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec26b9e8-8796-a7cb-9ebd-c2656ce384f6", "ansible_product_uuid": "ec26b9e8-8796-a7cb-9ebd-c2656ce384f6", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["4853857d-d3e2-44a6-8739-3837607c97e1"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": 
["4853857d-d3e2-44a6-8739-3837607c97e1"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 402, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261793972224, "block_size": 4096, "block_total": 65519099, "block_available": 63914544, "block_used": 1604555, "inode_total": 131070960, "inode_available": 131029088, "inode_used": 41872, "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1"}], "ansible_pkg_mgr": "dnf", "ansible_loadavg": {"1m": 0.4326171875, "5m": 0.22265625, "15m": 0.09228515625}, "ansible_service_mgr": "systemd", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} # clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing 
__main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing 
_weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing 
systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # 
destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _heapq # cleanup[2] removing heapq # destroy heapq # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing 
ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # 
cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing 
ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy 
ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy 
ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna # cleanup[2] removing multiprocessing.queues # cleanup[2] removing multiprocessing.synchronize # cleanup[2] removing multiprocessing.dummy.connection # cleanup[2] removing multiprocessing.dummy # destroy _sitebuiltins # destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma # destroy _blake2 # destroy binascii # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy ipaddress # destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy locale # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy logging # destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.queues # destroy multiprocessing.synchronize # destroy multiprocessing.dummy # destroy multiprocessing.pool # destroy signal # destroy pickle # destroy _compat_pickle # destroy _pickle # destroy queue # destroy _heapq # destroy _queue # destroy multiprocessing.reduction # destroy selectors # destroy shlex # destroy fcntl # destroy datetime # destroy subprocess # destroy base64 # destroy _ssl # destroy ansible.module_utils.compat.selinux # destroy getpass # destroy pwd # destroy termios # destroy json # destroy socket # destroy struct # destroy glob # destroy fnmatch # destroy ansible.module_utils.compat.typing # destroy 
ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # destroy unicodedata # destroy errno # destroy multiprocessing.connection # destroy tempfile # destroy multiprocessing.context # destroy multiprocessing.process # destroy multiprocessing.util # destroy _multiprocessing # destroy array # destroy multiprocessing.dummy.connection # cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping configparser # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping 
operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy sys.monitoring # destroy _socket # destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib # destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy encodings.idna # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref 
# destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref # destroy _hashlib # destroy _operator # destroy _sre # destroy _string # destroy re # destroy itertools # destroy _abc # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.45.153 closed. 
[WARNING]: Module invocation had junk after the JSON data:
ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy 
ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy 
ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna # cleanup[2] removing multiprocessing.queues # cleanup[2] removing multiprocessing.synchronize # cleanup[2] removing multiprocessing.dummy.connection # cleanup[2] removing multiprocessing.dummy # destroy _sitebuiltins # destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma # destroy _blake2 # destroy binascii # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy ipaddress # destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy 
grp # destroy encodings # destroy _locale # destroy locale # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy logging # destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.queues # destroy multiprocessing.synchronize # destroy multiprocessing.dummy # destroy multiprocessing.pool # destroy signal # destroy pickle # destroy _compat_pickle # destroy _pickle # destroy queue # destroy _heapq # destroy _queue # destroy multiprocessing.reduction # destroy selectors # destroy shlex # destroy fcntl # destroy datetime # destroy subprocess # destroy base64 # destroy _ssl # destroy ansible.module_utils.compat.selinux # destroy getpass # destroy pwd # destroy termios # destroy json # destroy socket # destroy struct # destroy glob # destroy fnmatch # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # destroy unicodedata # destroy errno # destroy multiprocessing.connection # destroy tempfile # destroy multiprocessing.context # destroy multiprocessing.process # destroy multiprocessing.util # destroy _multiprocessing # destroy array # destroy multiprocessing.dummy.connection # cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping configparser # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping 
_datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy 
_datetime # destroy sys.monitoring # destroy _socket # destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib # destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy encodings.idna # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref # destroy _hashlib # destroy _operator # destroy _sre # destroy _string # destroy re # destroy itertools # destroy _abc # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks [WARNING]: Platform linux on host managed_node1 is using the discovered Python interpreter at /usr/bin/python3.12, but future installation of another Python interpreter could change the meaning of that path. See https://docs.ansible.com/ansible- core/2.17/reference_appendices/interpreter_discovery.html for more information. 
11044 1726853236.79325: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853235.6683707-11081-82751601766332/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11044 1726853236.79340: _low_level_execute_command(): starting 11044 1726853236.79348: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853235.6683707-11081-82751601766332/ > /dev/null 2>&1 && sleep 0' 11044 1726853236.81083: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11044 1726853236.81101: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11044 1726853236.81119: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11044 1726853236.81136: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11044 1726853236.81154: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 <<< 11044 1726853236.81243: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 
11044 1726853236.81356: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK <<< 11044 1726853236.81626: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11044 1726853236.81678: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11044 1726853236.83771: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11044 1726853236.83776: stdout chunk (state=3): >>><<< 11044 1726853236.83778: stderr chunk (state=3): >>><<< 11044 1726853236.83782: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 
debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11044 1726853236.83784: handler run complete 11044 1726853236.84107: variable 'ansible_facts' from source: unknown 11044 1726853236.84490: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11044 1726853236.85452: variable 'ansible_facts' from source: unknown 11044 1726853236.85690: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11044 1726853236.86513: attempt loop complete, returning result 11044 1726853236.86516: _execute() done 11044 1726853236.86519: dumping result to json 11044 1726853236.86521: done dumping result, returning 11044 1726853236.86523: done running TaskExecutor() for managed_node1/TASK: Gathering Facts [02083763-bbaf-c5a6-f857-0000000000cd] 11044 1726853236.86525: sending task result for task 02083763-bbaf-c5a6-f857-0000000000cd 11044 1726853236.87722: done sending task result for task 02083763-bbaf-c5a6-f857-0000000000cd 11044 1726853236.87725: WORKER PROCESS EXITING ok: [managed_node1] 11044 1726853236.88413: no more pending results, returning what we have 11044 1726853236.88417: results queue empty 11044 1726853236.88418: checking for any_errors_fatal 11044 1726853236.88419: done checking for any_errors_fatal 11044 1726853236.88420: checking for max_fail_percentage 11044 1726853236.88421: done checking for max_fail_percentage 11044 1726853236.88422: checking to see if all hosts have failed and the running result is not ok 11044 1726853236.88423: done checking to see if all hosts have failed 11044 1726853236.88423: getting the remaining hosts for this loop 11044 1726853236.88425: done getting the remaining hosts for this loop 11044 1726853236.88428: getting the next task for host managed_node1 11044 1726853236.88434: done getting next task for host managed_node1 11044 1726853236.88436: ^ task is: TASK: meta (flush_handlers) 11044 
1726853236.88437: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11044 1726853236.88441: getting variables 11044 1726853236.88442: in VariableManager get_vars() 11044 1726853236.88463: Calling all_inventory to load vars for managed_node1 11044 1726853236.88466: Calling groups_inventory to load vars for managed_node1 11044 1726853236.88469: Calling all_plugins_inventory to load vars for managed_node1 11044 1726853236.88505: Calling all_plugins_play to load vars for managed_node1 11044 1726853236.88509: Calling groups_plugins_inventory to load vars for managed_node1 11044 1726853236.88512: Calling groups_plugins_play to load vars for managed_node1 11044 1726853236.88986: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11044 1726853236.89389: done with get_vars() 11044 1726853236.89399: done getting variables 11044 1726853236.89467: in VariableManager get_vars() 11044 1726853236.89478: Calling all_inventory to load vars for managed_node1 11044 1726853236.89543: Calling groups_inventory to load vars for managed_node1 11044 1726853236.89549: Calling all_plugins_inventory to load vars for managed_node1 11044 1726853236.89554: Calling all_plugins_play to load vars for managed_node1 11044 1726853236.89556: Calling groups_plugins_inventory to load vars for managed_node1 11044 1726853236.89559: Calling groups_plugins_play to load vars for managed_node1 11044 1726853236.90103: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11044 1726853236.90643: done with get_vars() 11044 1726853236.90659: done queuing things up, now waiting for results queue to drain 11044 1726853236.90661: results queue empty 11044 
1726853236.90662: checking for any_errors_fatal 11044 1726853236.90665: done checking for any_errors_fatal 11044 1726853236.90670: checking for max_fail_percentage 11044 1726853236.90880: done checking for max_fail_percentage 11044 1726853236.90881: checking to see if all hosts have failed and the running result is not ok 11044 1726853236.90882: done checking to see if all hosts have failed 11044 1726853236.90883: getting the remaining hosts for this loop 11044 1726853236.90884: done getting the remaining hosts for this loop 11044 1726853236.90887: getting the next task for host managed_node1 11044 1726853236.90891: done getting next task for host managed_node1 11044 1726853236.90894: ^ task is: TASK: Include the task 'el_repo_setup.yml' 11044 1726853236.90896: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11044 1726853236.90898: getting variables 11044 1726853236.90899: in VariableManager get_vars() 11044 1726853236.90907: Calling all_inventory to load vars for managed_node1 11044 1726853236.90909: Calling groups_inventory to load vars for managed_node1 11044 1726853236.90911: Calling all_plugins_inventory to load vars for managed_node1 11044 1726853236.90916: Calling all_plugins_play to load vars for managed_node1 11044 1726853236.90918: Calling groups_plugins_inventory to load vars for managed_node1 11044 1726853236.90921: Calling groups_plugins_play to load vars for managed_node1 11044 1726853236.91155: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11044 1726853236.91678: done with get_vars() 11044 1726853236.91685: done getting variables TASK [Include the task 'el_repo_setup.yml'] ************************************ task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/tests_bond_deprecated_nm.yml:11 Friday 20 September 2024 13:27:16 -0400 (0:00:01.287) 0:00:01.294 ****** 11044 1726853236.91876: entering _queue_task() for managed_node1/include_tasks 11044 1726853236.91878: Creating lock for include_tasks 11044 1726853236.92554: worker is 1 (out of 1 available) 11044 1726853236.92565: exiting _queue_task() for managed_node1/include_tasks 11044 1726853236.92579: done queuing things up, now waiting for results queue to drain 11044 1726853236.92581: waiting for pending results... 
11044 1726853236.93068: running TaskExecutor() for managed_node1/TASK: Include the task 'el_repo_setup.yml' 11044 1726853236.93426: in run() - task 02083763-bbaf-c5a6-f857-000000000006 11044 1726853236.93429: variable 'ansible_search_path' from source: unknown 11044 1726853236.93533: calling self._execute() 11044 1726853236.93536: variable 'ansible_host' from source: host vars for 'managed_node1' 11044 1726853236.93541: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11044 1726853236.93565: variable 'omit' from source: magic vars 11044 1726853236.93683: _execute() done 11044 1726853236.93692: dumping result to json 11044 1726853236.93701: done dumping result, returning 11044 1726853236.93713: done running TaskExecutor() for managed_node1/TASK: Include the task 'el_repo_setup.yml' [02083763-bbaf-c5a6-f857-000000000006] 11044 1726853236.93722: sending task result for task 02083763-bbaf-c5a6-f857-000000000006 11044 1726853236.93910: no more pending results, returning what we have 11044 1726853236.93915: in VariableManager get_vars() 11044 1726853236.93950: Calling all_inventory to load vars for managed_node1 11044 1726853236.93953: Calling groups_inventory to load vars for managed_node1 11044 1726853236.93956: Calling all_plugins_inventory to load vars for managed_node1 11044 1726853236.94083: Calling all_plugins_play to load vars for managed_node1 11044 1726853236.94086: Calling groups_plugins_inventory to load vars for managed_node1 11044 1726853236.94090: Calling groups_plugins_play to load vars for managed_node1 11044 1726853236.94366: done sending task result for task 02083763-bbaf-c5a6-f857-000000000006 11044 1726853236.94372: WORKER PROCESS EXITING 11044 1726853236.94394: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11044 1726853236.94585: done with get_vars() 11044 1726853236.94592: variable 'ansible_search_path' from source: unknown 11044 1726853236.94605: we have 
included files to process 11044 1726853236.94607: generating all_blocks data 11044 1726853236.94608: done generating all_blocks data 11044 1726853236.94609: processing included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml 11044 1726853236.94610: loading included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml 11044 1726853236.94612: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml 11044 1726853236.95553: in VariableManager get_vars() 11044 1726853236.95568: done with get_vars() 11044 1726853236.95581: done processing included file 11044 1726853236.95583: iterating over new_blocks loaded from include file 11044 1726853236.95585: in VariableManager get_vars() 11044 1726853236.95593: done with get_vars() 11044 1726853236.95595: filtering new block on tags 11044 1726853236.95718: done filtering new block on tags 11044 1726853236.95722: in VariableManager get_vars() 11044 1726853236.95732: done with get_vars() 11044 1726853236.95733: filtering new block on tags 11044 1726853236.95751: done filtering new block on tags 11044 1726853236.95754: in VariableManager get_vars() 11044 1726853236.95764: done with get_vars() 11044 1726853236.95766: filtering new block on tags 11044 1726853236.95883: done filtering new block on tags 11044 1726853236.95886: done iterating over new_blocks loaded from include file included: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml for managed_node1 11044 1726853236.95892: extending task lists for all hosts with included blocks 11044 1726853236.96057: done extending task lists 11044 1726853236.96059: done processing included files 11044 1726853236.96060: results queue empty 11044 1726853236.96060: checking for any_errors_fatal 11044 1726853236.96062: done checking for any_errors_fatal 11044 
1726853236.96062: checking for max_fail_percentage 11044 1726853236.96063: done checking for max_fail_percentage 11044 1726853236.96064: checking to see if all hosts have failed and the running result is not ok 11044 1726853236.96065: done checking to see if all hosts have failed 11044 1726853236.96066: getting the remaining hosts for this loop 11044 1726853236.96067: done getting the remaining hosts for this loop 11044 1726853236.96069: getting the next task for host managed_node1 11044 1726853236.96075: done getting next task for host managed_node1 11044 1726853236.96077: ^ task is: TASK: Gather the minimum subset of ansible_facts required by the network role test 11044 1726853236.96080: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11044 1726853236.96082: getting variables 11044 1726853236.96083: in VariableManager get_vars() 11044 1726853236.96091: Calling all_inventory to load vars for managed_node1 11044 1726853236.96093: Calling groups_inventory to load vars for managed_node1 11044 1726853236.96095: Calling all_plugins_inventory to load vars for managed_node1 11044 1726853236.96100: Calling all_plugins_play to load vars for managed_node1 11044 1726853236.96102: Calling groups_plugins_inventory to load vars for managed_node1 11044 1726853236.96105: Calling groups_plugins_play to load vars for managed_node1 11044 1726853236.96721: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11044 1726853236.97408: done with get_vars() 11044 1726853236.97416: done getting variables TASK [Gather the minimum subset of ansible_facts required by the network role test] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:3 Friday 20 September 2024 13:27:16 -0400 (0:00:00.057) 0:00:01.351 ****** 11044 1726853236.97609: entering _queue_task() for managed_node1/setup 11044 1726853236.98405: worker is 1 (out of 1 available) 11044 1726853236.98417: exiting _queue_task() for managed_node1/setup 11044 1726853236.98429: done queuing things up, now waiting for results queue to drain 11044 1726853236.98430: waiting for pending results... 
11044 1726853236.98781: running TaskExecutor() for managed_node1/TASK: Gather the minimum subset of ansible_facts required by the network role test 11044 1726853236.98954: in run() - task 02083763-bbaf-c5a6-f857-0000000000de 11044 1726853236.99048: variable 'ansible_search_path' from source: unknown 11044 1726853236.99057: variable 'ansible_search_path' from source: unknown 11044 1726853236.99104: calling self._execute() 11044 1726853236.99201: variable 'ansible_host' from source: host vars for 'managed_node1' 11044 1726853236.99213: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11044 1726853236.99234: variable 'omit' from source: magic vars 11044 1726853236.99943: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 11044 1726853237.03448: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 11044 1726853237.03536: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 11044 1726853237.03879: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 11044 1726853237.03883: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 11044 1726853237.03885: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 11044 1726853237.03978: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11044 1726853237.04180: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11044 1726853237.04184: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11044 1726853237.04314: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11044 1726853237.04317: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11044 1726853237.04749: variable 'ansible_facts' from source: unknown 11044 1726853237.04753: variable 'network_test_required_facts' from source: task vars 11044 1726853237.04755: Evaluated conditional (not ansible_facts.keys() | list | intersect(network_test_required_facts) == network_test_required_facts): True 11044 1726853237.04816: variable 'omit' from source: magic vars 11044 1726853237.04880: variable 'omit' from source: magic vars 11044 1726853237.04918: variable 'omit' from source: magic vars 11044 1726853237.04947: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11044 1726853237.04983: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11044 1726853237.05006: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11044 1726853237.05030: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11044 1726853237.05046: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11044 1726853237.05085: variable 'inventory_hostname' from source: host vars for 'managed_node1' 11044 1726853237.05093: variable 'ansible_host' from source: host vars for 
'managed_node1' 11044 1726853237.05101: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11044 1726853237.05195: Set connection var ansible_timeout to 10 11044 1726853237.05209: Set connection var ansible_shell_executable to /bin/sh 11044 1726853237.05216: Set connection var ansible_shell_type to sh 11044 1726853237.05225: Set connection var ansible_module_compression to ZIP_DEFLATED 11044 1726853237.05234: Set connection var ansible_connection to ssh 11044 1726853237.05242: Set connection var ansible_pipelining to False 11044 1726853237.05269: variable 'ansible_shell_executable' from source: unknown 11044 1726853237.05279: variable 'ansible_connection' from source: unknown 11044 1726853237.05289: variable 'ansible_module_compression' from source: unknown 11044 1726853237.05296: variable 'ansible_shell_type' from source: unknown 11044 1726853237.05304: variable 'ansible_shell_executable' from source: unknown 11044 1726853237.05310: variable 'ansible_host' from source: host vars for 'managed_node1' 11044 1726853237.05316: variable 'ansible_pipelining' from source: unknown 11044 1726853237.05322: variable 'ansible_timeout' from source: unknown 11044 1726853237.05329: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11044 1726853237.05467: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 11044 1726853237.05509: variable 'omit' from source: magic vars 11044 1726853237.05512: starting attempt loop 11044 1726853237.05514: running the handler 11044 1726853237.05516: _low_level_execute_command(): starting 11044 1726853237.05523: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11044 1726853237.06253: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11044 
1726853237.06273: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11044 1726853237.06385: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 11044 1726853237.06402: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11044 1726853237.06421: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11044 1726853237.06508: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11044 1726853237.08177: stdout chunk (state=3): >>>/root <<< 11044 1726853237.08308: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11044 1726853237.08336: stdout chunk (state=3): >>><<< 11044 1726853237.08536: stderr chunk (state=3): >>><<< 11044 1726853237.08542: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 
originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11044 1726853237.08554: _low_level_execute_command(): starting 11044 1726853237.08557: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853237.0845318-11147-73532273450786 `" && echo ansible-tmp-1726853237.0845318-11147-73532273450786="` echo /root/.ansible/tmp/ansible-tmp-1726853237.0845318-11147-73532273450786 `" ) && sleep 0' 11044 1726853237.09064: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11044 1726853237.09082: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11044 1726853237.09098: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11044 1726853237.09192: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match 
pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11044 1726853237.09220: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 11044 1726853237.09240: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11044 1726853237.09257: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11044 1726853237.09332: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11044 1726853237.11441: stdout chunk (state=3): >>>ansible-tmp-1726853237.0845318-11147-73532273450786=/root/.ansible/tmp/ansible-tmp-1726853237.0845318-11147-73532273450786 <<< 11044 1726853237.11530: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11044 1726853237.11550: stderr chunk (state=3): >>><<< 11044 1726853237.11586: stdout chunk (state=3): >>><<< 11044 1726853237.11788: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853237.0845318-11147-73532273450786=/root/.ansible/tmp/ansible-tmp-1726853237.0845318-11147-73532273450786 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 
is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11044 1726853237.11792: variable 'ansible_module_compression' from source: unknown 11044 1726853237.11979: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-1104467doc9gy/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 11044 1726853237.12133: variable 'ansible_facts' from source: unknown 11044 1726853237.12373: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853237.0845318-11147-73532273450786/AnsiballZ_setup.py 11044 1726853237.12589: Sending initial data 11044 1726853237.12592: Sent initial data (153 bytes) 11044 1726853237.13210: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11044 1726853237.13234: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11044 1726853237.13263: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11044 1726853237.13341: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11044 1726853237.13389: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 11044 1726853237.13411: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11044 1726853237.13454: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11044 1726853237.13498: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11044 1726853237.15037: stderr chunk (state=3): >>>debug2: Remote version: 3 <<< 11044 1726853237.15065: stderr chunk (state=3): >>>debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 11044 1726853237.15096: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 11044 1726853237.15145: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-1104467doc9gy/tmpdjizw2ye /root/.ansible/tmp/ansible-tmp-1726853237.0845318-11147-73532273450786/AnsiballZ_setup.py <<< 11044 1726853237.15149: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853237.0845318-11147-73532273450786/AnsiballZ_setup.py" <<< 11044 1726853237.15199: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-1104467doc9gy/tmpdjizw2ye" to remote "/root/.ansible/tmp/ansible-tmp-1726853237.0845318-11147-73532273450786/AnsiballZ_setup.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853237.0845318-11147-73532273450786/AnsiballZ_setup.py" <<< 11044 1726853237.17445: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11044 1726853237.17449: stdout chunk (state=3): >>><<< 11044 1726853237.17451: stderr chunk (state=3): >>><<< 11044 1726853237.17453: done transferring module to remote 11044 1726853237.17455: _low_level_execute_command(): starting 11044 1726853237.17457: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853237.0845318-11147-73532273450786/ /root/.ansible/tmp/ansible-tmp-1726853237.0845318-11147-73532273450786/AnsiballZ_setup.py && sleep 0' 11044 1726853237.18049: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11044 1726853237.18053: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11044 1726853237.18055: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11044 1726853237.18057: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 <<< 11044 1726853237.18128: stderr chunk (state=3): >>>debug2: match not 
found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 11044 1726853237.18135: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11044 1726853237.18188: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 11044 1726853237.18241: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11044 1726853237.18261: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11044 1726853237.18306: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11044 1726853237.20208: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11044 1726853237.20229: stdout chunk (state=3): >>><<< 11044 1726853237.20231: stderr chunk (state=3): >>><<< 11044 1726853237.20234: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11044 1726853237.20237: _low_level_execute_command(): starting 11044 1726853237.20239: _low_level_execute_command(): executing: /bin/sh -c 'PYTHONVERBOSE=1 /usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853237.0845318-11147-73532273450786/AnsiballZ_setup.py && sleep 0' 11044 1726853237.21001: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11044 1726853237.21016: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 11044 1726853237.21035: stderr chunk (state=3): >>>debug2: fd 3 setting 
O_NONBLOCK <<< 11044 1726853237.21066: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11044 1726853237.21140: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11044 1726853237.23328: stdout chunk (state=3): >>>import _frozen_importlib # frozen <<< 11044 1726853237.23354: stdout chunk (state=3): >>>import _imp # builtin <<< 11044 1726853237.23396: stdout chunk (state=3): >>>import '_thread' # import '_warnings' # import '_weakref' # <<< 11044 1726853237.23455: stdout chunk (state=3): >>>import '_io' # <<< 11044 1726853237.23484: stdout chunk (state=3): >>>import 'marshal' # <<< 11044 1726853237.23513: stdout chunk (state=3): >>>import 'posix' # <<< 11044 1726853237.23540: stdout chunk (state=3): >>>import '_frozen_importlib_external' # # installing zipimport hook <<< 11044 1726853237.23572: stdout chunk (state=3): >>>import 'time' # <<< 11044 1726853237.23584: stdout chunk (state=3): >>>import 'zipimport' # # installed zipimport hook <<< 11044 1726853237.23633: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' <<< 11044 1726853237.23653: stdout chunk (state=3): >>>import '_codecs' # <<< 11044 1726853237.23676: stdout chunk (state=3): >>>import 'codecs' # <<< 11044 1726853237.23701: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py <<< 11044 1726853237.23744: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6a493684d0> <<< 11044 1726853237.23773: stdout chunk (state=3): >>>import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 
0x7f6a49337b30> <<< 11044 1726853237.23777: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' <<< 11044 1726853237.23805: stdout chunk (state=3): >>>import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6a4936aa50> <<< 11044 1726853237.23808: stdout chunk (state=3): >>>import '_signal' # <<< 11044 1726853237.23825: stdout chunk (state=3): >>>import '_abc' # <<< 11044 1726853237.23850: stdout chunk (state=3): >>>import 'abc' # <<< 11044 1726853237.23869: stdout chunk (state=3): >>>import 'io' # <<< 11044 1726853237.23892: stdout chunk (state=3): >>>import '_stat' # import 'stat' # <<< 11044 1726853237.23983: stdout chunk (state=3): >>>import '_collections_abc' # <<< 11044 1726853237.24006: stdout chunk (state=3): >>>import 'genericpath' # import 'posixpath' # <<< 11044 1726853237.24039: stdout chunk (state=3): >>>import 'os' # <<< 11044 1726853237.24077: stdout chunk (state=3): >>>import '_sitebuiltins' # <<< 11044 1726853237.24080: stdout chunk (state=3): >>>Processing user site-packages Processing global site-packages <<< 11044 1726853237.24099: stdout chunk (state=3): >>>Adding directory: '/usr/lib64/python3.12/site-packages' Adding directory: '/usr/lib/python3.12/site-packages' Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' <<< 11044 1726853237.24136: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8_sig.py <<< 11044 1726853237.24157: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' <<< 11044 1726853237.24190: stdout chunk (state=3): >>>import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6a4913d130> <<< 11044 
1726853237.24224: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py <<< 11044 1726853237.24228: stdout chunk (state=3): >>># code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' <<< 11044 1726853237.24247: stdout chunk (state=3): >>>import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6a4913dfa0> <<< 11044 1726853237.24266: stdout chunk (state=3): >>>import 'site' # <<< 11044 1726853237.24296: stdout chunk (state=3): >>>Python 3.12.5 (main, Aug 23 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. <<< 11044 1726853237.24680: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py <<< 11044 1726853237.24710: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' <<< 11044 1726853237.24716: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py # code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' <<< 11044 1726853237.24736: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py <<< 11044 1726853237.24782: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' <<< 11044 1726853237.24990: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py # code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6a4917bec0> # 
/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py # code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6a4917bf80> # /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py # code object from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py <<< 11044 1726853237.25045: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' <<< 11044 1726853237.25074: stdout chunk (state=3): >>>import 'itertools' # <<< 11044 1726853237.25104: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py <<< 11044 1726853237.25107: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6a491b3830> <<< 11044 1726853237.25137: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py <<< 11044 1726853237.25146: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6a491b3ec0> <<< 11044 1726853237.25181: stdout chunk (state=3): >>>import '_collections' # <<< 11044 1726853237.25236: stdout chunk (state=3): >>>import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6a49193b60> <<< 11044 1726853237.25254: stdout chunk (state=3): >>>import '_functools' # <<< 11044 1726853237.25295: stdout chunk (state=3): >>>import 'functools' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f6a491912b0> <<< 11044 1726853237.25430: stdout chunk (state=3): >>>import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6a49179070> <<< 11044 1726853237.25483: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py # code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' <<< 11044 1726853237.25507: stdout chunk (state=3): >>>import '_sre' # <<< 11044 1726853237.25559: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py # code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' <<< 11044 1726853237.25583: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py <<< 11044 1726853237.25592: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' <<< 11044 1726853237.25635: stdout chunk (state=3): >>>import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6a491d37d0> <<< 11044 1726853237.25659: stdout chunk (state=3): >>>import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6a491d23f0> <<< 11044 1726853237.25758: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py # code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6a49192150> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6a491d0bc0> # /usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py <<< 11044 1726853237.25772: stdout chunk (state=3): >>># code object from 
'/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6a49208890> <<< 11044 1726853237.25780: stdout chunk (state=3): >>>import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6a491782f0> <<< 11044 1726853237.25808: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches /usr/lib64/python3.12/struct.py <<< 11044 1726853237.25812: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' <<< 11044 1726853237.25852: stdout chunk (state=3): >>># extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' <<< 11044 1726853237.25859: stdout chunk (state=3): >>># extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6a49208d40> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6a49208bf0> <<< 11044 1726853237.25927: stdout chunk (state=3): >>># extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6a49208fe0> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6a49176e10> <<< 11044 1726853237.25976: stdout chunk (state=3): >>># /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' <<< 11044 1726853237.26053: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches 
/usr/lib64/python3.12/warnings.py # code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6a49209670> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6a49209370> <<< 11044 1726853237.26085: stdout chunk (state=3): >>>import 'importlib.machinery' # <<< 11044 1726853237.26098: stdout chunk (state=3): >>># /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py <<< 11044 1726853237.26133: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6a4920a540> <<< 11044 1726853237.26136: stdout chunk (state=3): >>>import 'importlib.util' # <<< 11044 1726853237.26178: stdout chunk (state=3): >>>import 'runpy' # # /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches /usr/lib64/python3.12/shutil.py <<< 11044 1726853237.26218: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' <<< 11044 1726853237.26254: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py # code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6a49220740> <<< 11044 1726853237.26308: stdout chunk (state=3): >>>import 'errno' # # extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6a49221e20> <<< 11044 1726853237.26343: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc 
matches /usr/lib64/python3.12/bz2.py <<< 11044 1726853237.26376: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' <<< 11044 1726853237.26402: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py <<< 11044 1726853237.26414: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6a49222cc0> <<< 11044 1726853237.26462: stdout chunk (state=3): >>># extension module '_bz2' loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6a492232f0> <<< 11044 1726853237.26479: stdout chunk (state=3): >>>import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6a49222210> <<< 11044 1726853237.26497: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py <<< 11044 1726853237.26548: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' <<< 11044 1726853237.26801: stdout chunk (state=3): >>># extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so'<<< 11044 1726853237.26812: stdout chunk (state=3): >>> # extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6a49223d70> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6a492234a0> import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6a4920a4b0> # 
/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py # code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py # code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' <<< 11044 1726853237.26864: stdout chunk (state=3): >>># extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6a48f17c50> # /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py # code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' <<< 11044 1726853237.26955: stdout chunk (state=3): >>># extension module '_bisect' loaded from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6a48f407a0> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6a48f40500> # extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6a48f407d0> <<< 11044 1726853237.26965: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py <<< 11044 1726853237.26981: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' <<< 
11044 1726853237.27068: stdout chunk (state=3): >>># extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' <<< 11044 1726853237.27259: stdout chunk (state=3): >>># extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' <<< 11044 1726853237.27265: stdout chunk (state=3): >>>import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6a48f41100> <<< 11044 1726853237.27495: stdout chunk (state=3): >>># extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6a48f41af0> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6a48f409b0> import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6a48f15df0> # /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py <<< 11044 1726853237.27520: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' <<< 11044 1726853237.27546: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py <<< 11044 1726853237.27565: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' <<< 11044 1726853237.27773: stdout chunk (state=3): >>>import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6a48f42f00> import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6a48f41c40> import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6a4920ac60> # /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc 
matches /usr/lib64/python3.12/zipfile/__init__.py # code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py <<< 11044 1726853237.27811: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' <<< 11044 1726853237.27860: stdout chunk (state=3): >>>import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6a48f6b230> <<< 11044 1726853237.27919: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py <<< 11044 1726853237.27931: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' <<< 11044 1726853237.27956: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py <<< 11044 1726853237.27976: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' <<< 11044 1726853237.28014: stdout chunk (state=3): >>>import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6a48f8f5f0> <<< 11044 1726853237.28038: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py <<< 11044 1726853237.28092: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' <<< 11044 1726853237.28152: stdout chunk (state=3): >>>import 'ntpath' # <<< 11044 1726853237.28178: stdout chunk (state=3): >>># /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object 
at 0x7f6a48ff0380> <<< 11044 1726853237.28209: stdout chunk (state=3): >>># /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py <<< 11044 1726853237.28234: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' <<< 11044 1726853237.28250: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py <<< 11044 1726853237.28285: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' <<< 11044 1726853237.28368: stdout chunk (state=3): >>>import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6a48ff2ae0> <<< 11044 1726853237.28473: stdout chunk (state=3): >>>import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6a48ff04a0> <<< 11044 1726853237.28526: stdout chunk (state=3): >>>import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6a48fb1370> <<< 11044 1726853237.28530: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6a48929430> <<< 11044 1726853237.28562: stdout chunk (state=3): >>>import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6a48f8e3f0> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6a48f43e00> <<< 11044 1726853237.29065: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/cp437.pyc' import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7f6a48f8e750> <<< 11044 1726853237.29278: stdout chunk (state=3): >>># zipimport: found 103 names in 
'/tmp/ansible_setup_payload_2dnodk4o/ansible_setup_payload.zip' # zipimport: zlib available <<< 11044 1726853237.29417: stdout chunk (state=3): >>># zipimport: zlib available <<< 11044 1726853237.29462: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py # code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc'<<< 11044 1726853237.29473: stdout chunk (state=3): >>> <<< 11044 1726853237.29524: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py <<< 11044 1726853237.29638: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc'<<< 11044 1726853237.29685: stdout chunk (state=3): >>> <<< 11044 1726853237.29703: stdout chunk (state=3): >>># /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py # code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6a48993170><<< 11044 1726853237.29735: stdout chunk (state=3): >>> import '_typing' # <<< 11044 1726853237.30017: stdout chunk (state=3): >>>import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6a48972060> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6a489711c0><<< 11044 1726853237.30038: stdout chunk (state=3): >>> # zipimport: zlib available<<< 11044 1726853237.30091: stdout chunk (state=3): >>> import 'ansible' # <<< 11044 1726853237.30112: stdout chunk (state=3): >>># zipimport: zlib available <<< 11044 1726853237.30153: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available<<< 11044 1726853237.30164: stdout chunk (state=3): >>> <<< 11044 1726853237.30204: stdout chunk (state=3): >>>import 'ansible.module_utils' # # zipimport: zlib available <<< 
11044 1726853237.32204: stdout chunk (state=3): >>># zipimport: zlib available <<< 11044 1726853237.33308: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py # code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6a48991040> <<< 11044 1726853237.33342: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py # code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py <<< 11044 1726853237.33368: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' <<< 11044 1726853237.33411: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' # extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6a489c2b10> <<< 11044 1726853237.33455: stdout chunk (state=3): >>>import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6a489c28a0> <<< 11044 1726853237.33523: stdout chunk (state=3): >>>import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6a489c21b0> # /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' <<< 11044 
1726853237.33614: stdout chunk (state=3): >>>import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6a489c2ba0> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6a48993e00> import 'atexit' # <<< 11044 1726853237.33714: stdout chunk (state=3): >>># extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6a489c3860> # extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6a489c39e0> <<< 11044 1726853237.33721: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py <<< 11044 1726853237.33779: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' import '_locale' # import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6a489c3ef0> <<< 11044 1726853237.33782: stdout chunk (state=3): >>>import 'pwd' # <<< 11044 1726853237.33829: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py <<< 11044 1726853237.33869: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' <<< 11044 1726853237.33942: stdout chunk (state=3): >>>import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6a4882dc70> <<< 11044 1726853237.33948: stdout chunk (state=3): >>># extension module 'select' loaded from 
'/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6a4882f890> <<< 11044 1726853237.33975: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py # code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6a48830290> # /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py <<< 11044 1726853237.34213: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' <<< 11044 1726853237.34217: stdout chunk (state=3): >>>import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6a48831430> <<< 11044 1726853237.34241: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py # code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py # code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6a48833f20> # extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6a48f42e70> import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6a488321e0> # 
/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py <<< 11044 1726853237.34298: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' <<< 11044 1726853237.34303: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py # code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' <<< 11044 1726853237.34320: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py <<< 11044 1726853237.34463: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' <<< 11044 1726853237.34467: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py # code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' <<< 11044 1726853237.34483: stdout chunk (state=3): >>>import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6a4883bdd0> import '_tokenize' # <<< 11044 1726853237.34559: stdout chunk (state=3): >>>import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6a4883a8a0> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6a4883a600> <<< 11044 1726853237.34584: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py # code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' <<< 11044 1726853237.34658: stdout chunk (state=3): >>>import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6a4883ab70> <<< 11044 1726853237.34783: stdout chunk (state=3): >>>import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6a488326f0> # extension module 'syslog' loaded from 
'/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6a4887ff80> <<< 11044 1726853237.34853: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6a4887ffe0> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' <<< 11044 1726853237.34864: stdout chunk (state=3): >>># extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6a48881b50> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6a48881910> <<< 11044 1726853237.34952: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py # code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' <<< 11044 1726853237.34989: stdout chunk (state=3): >>># extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' # extension 
module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6a48883fe0> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6a488821b0> <<< 11044 1726853237.35080: stdout chunk (state=3): >>># /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py <<< 11044 1726853237.35105: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py # code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' import '_string' # import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6a48887770> <<< 11044 1726853237.35220: stdout chunk (state=3): >>>import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6a48884140> <<< 11044 1726853237.35362: stdout chunk (state=3): >>># extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6a48888530> <<< 11044 1726853237.35388: stdout chunk (state=3): >>># extension module 'systemd._reader' loaded from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6a48888770> # extension module 'systemd.id128' loaded from 
'/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6a48888aa0> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6a48880260> <<< 11044 1726853237.35423: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py <<< 11044 1726853237.35503: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' # extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' # extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6a487140e0> <<< 11044 1726853237.35688: stdout chunk (state=3): >>># extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6a48715310> <<< 11044 1726853237.35756: stdout chunk (state=3): >>>import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6a4888a870> # extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from 
'/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6a4888bbf0> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6a4888a4e0> <<< 11044 1726853237.35769: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat' # # zipimport: zlib available <<< 11044 1726853237.35838: stdout chunk (state=3): >>># zipimport: zlib available <<< 11044 1726853237.36174: stdout chunk (state=3): >>># zipimport: zlib available <<< 11044 1726853237.36180: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.common' # # zipimport: zlib available <<< 11044 1726853237.36206: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.common.text' # # zipimport: zlib available # zipimport: zlib available <<< 11044 1726853237.36227: stdout chunk (state=3): >>># zipimport: zlib available <<< 11044 1726853237.36774: stdout chunk (state=3): >>># zipimport: zlib available <<< 11044 1726853237.37318: stdout chunk (state=3): >>>import 'ansible.module_utils.six' # import 'ansible.module_utils.six.moves' # <<< 11044 1726853237.37339: stdout chunk (state=3): >>>import 'ansible.module_utils.six.moves.collections_abc' # import 'ansible.module_utils.common.text.converters' # <<< 11044 1726853237.37366: stdout chunk (state=3): >>># /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py <<< 11044 1726853237.37403: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' <<< 11044 1726853237.37423: stdout chunk (state=3): >>># extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' # extension module '_ctypes' executed from 
'/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6a48719430> <<< 11044 1726853237.37513: stdout chunk (state=3): >>># /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' <<< 11044 1726853237.37536: stdout chunk (state=3): >>>import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6a4871a1b0> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6a48888110> <<< 11044 1726853237.37582: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.selinux' # <<< 11044 1726853237.37617: stdout chunk (state=3): >>># zipimport: zlib available <<< 11044 1726853237.37641: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils._text' # # zipimport: zlib available <<< 11044 1726853237.37790: stdout chunk (state=3): >>># zipimport: zlib available <<< 11044 1726853237.37961: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py # code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6a4871a360> <<< 11044 1726853237.37976: stdout chunk (state=3): >>># zipimport: zlib available <<< 11044 1726853237.38423: stdout chunk (state=3): >>># zipimport: zlib available <<< 11044 1726853237.38865: stdout chunk (state=3): >>># zipimport: zlib available <<< 11044 1726853237.38946: stdout chunk (state=3): >>># zipimport: zlib available <<< 11044 1726853237.39010: stdout chunk (state=3): >>>import 'ansible.module_utils.common.collections' # <<< 11044 1726853237.39055: stdout chunk (state=3): >>># zipimport: zlib available <<< 11044 1726853237.39177: stdout chunk (state=3): >>># zipimport: zlib available 
import 'ansible.module_utils.common.warnings' # # zipimport: zlib available # zipimport: zlib available <<< 11044 1726853237.39273: stdout chunk (state=3): >>>import 'ansible.module_utils.errors' # <<< 11044 1726853237.39351: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing' # # zipimport: zlib available <<< 11044 1726853237.39354: stdout chunk (state=3): >>># zipimport: zlib available <<< 11044 1726853237.39424: stdout chunk (state=3): >>>import 'ansible.module_utils.parsing.convert_bool' # # zipimport: zlib available <<< 11044 1726853237.39620: stdout chunk (state=3): >>># zipimport: zlib available <<< 11044 1726853237.39869: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py <<< 11044 1726853237.40312: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' import '_ast' # import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6a4871b440> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text.formatters' # import 'ansible.module_utils.common.validation' # import 'ansible.module_utils.common.parameters' # import 'ansible.module_utils.common.arg_spec' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.locale' # # zipimport: zlib available <<< 11044 1726853237.40315: stdout chunk (state=3): >>># zipimport: zlib available <<< 11044 1726853237.40357: stdout chunk (state=3): >>># zipimport: zlib available <<< 11044 1726853237.40415: stdout chunk (state=3): >>># zipimport: zlib available <<< 11044 1726853237.40482: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py <<< 11044 1726853237.40541: stdout chunk (state=3): >>># code object from 
'/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' <<< 11044 1726853237.40690: stdout chunk (state=3): >>># extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6a48726090> import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6a487216a0> import 'ansible.module_utils.common.file' # import 'ansible.module_utils.common.process' # # zipimport: zlib available <<< 11044 1726853237.40749: stdout chunk (state=3): >>># zipimport: zlib available <<< 11044 1726853237.40808: stdout chunk (state=3): >>># zipimport: zlib available <<< 11044 1726853237.40840: stdout chunk (state=3): >>># zipimport: zlib available <<< 11044 1726853237.40976: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' <<< 11044 1726853237.41001: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/distro.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py # code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' <<< 11044 1726853237.41057: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py # code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' <<< 11044 
1726853237.41187: stdout chunk (state=3): >>>import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6a4880e900> <<< 11044 1726853237.41238: stdout chunk (state=3): >>>import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6a489ee5d0> import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6a48726060> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6a487183b0> # destroy ansible.module_utils.distro import 'ansible.module_utils.distro' # # zipimport: zlib available <<< 11044 1726853237.41282: stdout chunk (state=3): >>># zipimport: zlib available <<< 11044 1726853237.41303: stdout chunk (state=3): >>>import 'ansible.module_utils.common._utils' # import 'ansible.module_utils.common.sys_info' # <<< 11044 1726853237.41518: stdout chunk (state=3): >>>import 'ansible.module_utils.basic' # # zipimport: zlib available # zipimport: zlib available import 'ansible.modules' # # zipimport: zlib available <<< 11044 1726853237.41525: stdout chunk (state=3): >>># zipimport: zlib available <<< 11044 1726853237.41527: stdout chunk (state=3): >>># zipimport: zlib available <<< 11044 1726853237.41558: stdout chunk (state=3): >>># zipimport: zlib available <<< 11044 1726853237.41565: stdout chunk (state=3): >>># zipimport: zlib available <<< 11044 1726853237.41604: stdout chunk (state=3): >>># zipimport: zlib available <<< 11044 1726853237.41849: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.namespace' # # zipimport: zlib available # zipimport: zlib available <<< 11044 1726853237.41876: stdout chunk (state=3): >>># zipimport: zlib available <<< 11044 1726853237.41900: stdout chunk (state=3): >>># zipimport: zlib available <<< 11044 1726853237.41940: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.typing' # <<< 11044 1726853237.41958: stdout chunk (state=3): >>># zipimport: zlib available <<< 11044 
1726853237.42122: stdout chunk (state=3): >>># zipimport: zlib available <<< 11044 1726853237.42442: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/__init__.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc' <<< 11044 1726853237.42445: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/context.py <<< 11044 1726853237.42448: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc' <<< 11044 1726853237.42477: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/process.py <<< 11044 1726853237.42520: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc' import 'multiprocessing.process' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6a487b6030> <<< 11044 1726853237.42556: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/reduction.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc' <<< 11044 1726853237.42561: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc matches /usr/lib64/python3.12/pickle.py <<< 11044 1726853237.42642: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc' <<< 11044 1726853237.42646: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc matches /usr/lib64/python3.12/_compat_pickle.py <<< 11044 1726853237.42659: stdout chunk (state=3): >>># 
code object from '/usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc' import '_compat_pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6a4833ff80> <<< 11044 1726853237.42705: stdout chunk (state=3): >>># extension module '_pickle' loaded from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' <<< 11044 1726853237.42708: stdout chunk (state=3): >>># extension module '_pickle' executed from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' import '_pickle' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6a48354590> <<< 11044 1726853237.42769: stdout chunk (state=3): >>>import 'pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6a4879e8a0> import 'multiprocessing.reduction' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6a487b6ba0> <<< 11044 1726853237.42804: stdout chunk (state=3): >>>import 'multiprocessing.context' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6a487b47d0> <<< 11044 1726853237.43197: stdout chunk (state=3): >>>import 'multiprocessing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6a487b4320> # /usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/pool.py <<< 11044 1726853237.43201: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc matches /usr/lib64/python3.12/queue.py # code object from '/usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc matches /usr/lib64/python3.12/heapq.py # code object from '/usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc' # extension module '_heapq' loaded from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' # extension module '_heapq' executed from 
'/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6a48357350> import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6a48356c00> # extension module '_queue' loaded from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' # extension module '_queue' executed from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' import '_queue' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6a48356de0> import 'queue' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6a48356030> # /usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/util.py <<< 11044 1726853237.43203: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc' <<< 11044 1726853237.43210: stdout chunk (state=3): >>>import 'multiprocessing.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6a48357530> <<< 11044 1726853237.43213: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/connection.py <<< 11044 1726853237.43239: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc' <<< 11044 1726853237.43258: stdout chunk (state=3): >>># extension module '_multiprocessing' loaded from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' # extension module '_multiprocessing' executed from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' import '_multiprocessing' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6a483a2060> <<< 11044 1726853237.43291: stdout chunk (state=3): >>>import 'multiprocessing.connection' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f6a48357f80> <<< 11044 1726853237.43525: stdout chunk (state=3): >>>import 'multiprocessing.pool' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6a487b4380> import 'ansible.module_utils.facts.timeout' # import 'ansible.module_utils.facts.collector' # <<< 11044 1726853237.43528: stdout chunk (state=3): >>># zipimport: zlib available <<< 11044 1726853237.43531: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.other' # <<< 11044 1726853237.43533: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 11044 1726853237.43694: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.other.facter' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other.ohai' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system' # # zipimport: zlib available # zipimport: zlib available <<< 11044 1726853237.43730: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.apparmor' # <<< 11044 1726853237.43749: stdout chunk (state=3): >>># zipimport: zlib available <<< 11044 1726853237.43786: stdout chunk (state=3): >>># zipimport: zlib available <<< 11044 1726853237.43868: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.caps' # <<< 11044 1726853237.43875: stdout chunk (state=3): >>># zipimport: zlib available <<< 11044 1726853237.43893: stdout chunk (state=3): >>># zipimport: zlib available <<< 11044 1726853237.43962: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.chroot' # <<< 11044 1726853237.43965: stdout chunk (state=3): >>># zipimport: zlib available <<< 11044 1726853237.44277: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.utils' # import 'ansible.module_utils.facts.system.cmdline' # # zipimport: 
zlib available <<< 11044 1726853237.44669: stdout chunk (state=3): >>># zipimport: zlib available <<< 11044 1726853237.45102: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.distribution' # <<< 11044 1726853237.45117: stdout chunk (state=3): >>># zipimport: zlib available <<< 11044 1726853237.45174: stdout chunk (state=3): >>># zipimport: zlib available <<< 11044 1726853237.45280: stdout chunk (state=3): >>># zipimport: zlib available <<< 11044 1726853237.45305: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.compat.datetime' # import 'ansible.module_utils.facts.system.date_time' # # zipimport: zlib available <<< 11044 1726853237.45336: stdout chunk (state=3): >>># zipimport: zlib available <<< 11044 1726853237.45365: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.env' # <<< 11044 1726853237.45383: stdout chunk (state=3): >>># zipimport: zlib available <<< 11044 1726853237.45437: stdout chunk (state=3): >>># zipimport: zlib available <<< 11044 1726853237.45497: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.dns' # <<< 11044 1726853237.45509: stdout chunk (state=3): >>># zipimport: zlib available <<< 11044 1726853237.45533: stdout chunk (state=3): >>># zipimport: zlib available <<< 11044 1726853237.45564: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.fips' # <<< 11044 1726853237.45580: stdout chunk (state=3): >>># zipimport: zlib available <<< 11044 1726853237.45607: stdout chunk (state=3): >>># zipimport: zlib available <<< 11044 1726853237.45650: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.loadavg' # <<< 11044 1726853237.45658: stdout chunk (state=3): >>># zipimport: zlib available <<< 11044 1726853237.45726: stdout chunk (state=3): >>># zipimport: zlib available <<< 11044 1726853237.45817: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/glob.py # code 
object from '/usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc' <<< 11044 1726853237.45841: stdout chunk (state=3): >>>import 'glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6a483a37d0> <<< 11044 1726853237.45889: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc matches /usr/lib64/python3.12/configparser.py <<< 11044 1726853237.45919: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc' <<< 11044 1726853237.46041: stdout chunk (state=3): >>>import 'configparser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6a483a2db0> import 'ansible.module_utils.facts.system.local' # <<< 11044 1726853237.46053: stdout chunk (state=3): >>># zipimport: zlib available <<< 11044 1726853237.46098: stdout chunk (state=3): >>># zipimport: zlib available <<< 11044 1726853237.46193: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.lsb' # # zipimport: zlib available <<< 11044 1726853237.46261: stdout chunk (state=3): >>># zipimport: zlib available <<< 11044 1726853237.46358: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.pkg_mgr' # <<< 11044 1726853237.46376: stdout chunk (state=3): >>># zipimport: zlib available <<< 11044 1726853237.46467: stdout chunk (state=3): >>># zipimport: zlib available <<< 11044 1726853237.46537: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.platform' # # zipimport: zlib available <<< 11044 1726853237.46599: stdout chunk (state=3): >>># zipimport: zlib available <<< 11044 1726853237.46610: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc matches /usr/lib64/python3.12/ssl.py <<< 11044 1726853237.46647: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc' <<< 11044 1726853237.46751: stdout chunk (state=3): >>># extension module '_ssl' loaded from 
'/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' <<< 11044 1726853237.46775: stdout chunk (state=3): >>># extension module '_ssl' executed from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' import '_ssl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6a483e23c0> <<< 11044 1726853237.47098: stdout chunk (state=3): >>>import 'ssl' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6a483d3260> import 'ansible.module_utils.facts.system.python' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.selinux' # <<< 11044 1726853237.47178: stdout chunk (state=3): >>># zipimport: zlib available <<< 11044 1726853237.47191: stdout chunk (state=3): >>># zipimport: zlib available <<< 11044 1726853237.47290: stdout chunk (state=3): >>># zipimport: zlib available <<< 11044 1726853237.47386: stdout chunk (state=3): >>># zipimport: zlib available <<< 11044 1726853237.47532: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.version' # import 'ansible.module_utils.facts.system.service_mgr' # <<< 11044 1726853237.47554: stdout chunk (state=3): >>># zipimport: zlib available <<< 11044 1726853237.47583: stdout chunk (state=3): >>># zipimport: zlib available <<< 11044 1726853237.47678: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.ssh_pub_keys' # <<< 11044 1726853237.47681: stdout chunk (state=3): >>># zipimport: zlib available <<< 11044 1726853237.47764: stdout chunk (state=3): >>># zipimport: zlib available <<< 11044 1726853237.47785: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc matches /usr/lib64/python3.12/getpass.py # code object from '/usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc' <<< 11044 1726853237.47822: stdout chunk (state=3): >>># extension module 'termios' loaded from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' # extension module 
'termios' executed from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' import 'termios' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6a483f5d00> import 'getpass' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6a483d3380> import 'ansible.module_utils.facts.system.user' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware' # <<< 11044 1726853237.47893: stdout chunk (state=3): >>># zipimport: zlib available <<< 11044 1726853237.47919: stdout chunk (state=3): >>># zipimport: zlib available <<< 11044 1726853237.47932: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.base' # # zipimport: zlib available <<< 11044 1726853237.48210: stdout chunk (state=3): >>># zipimport: zlib available <<< 11044 1726853237.48240: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.aix' # # zipimport: zlib available <<< 11044 1726853237.48348: stdout chunk (state=3): >>># zipimport: zlib available <<< 11044 1726853237.48446: stdout chunk (state=3): >>># zipimport: zlib available <<< 11044 1726853237.48482: stdout chunk (state=3): >>># zipimport: zlib available <<< 11044 1726853237.48526: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.sysctl' # import 'ansible.module_utils.facts.hardware.darwin' # <<< 11044 1726853237.48560: stdout chunk (state=3): >>># zipimport: zlib available <<< 11044 1726853237.48592: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 11044 1726853237.48814: stdout chunk (state=3): >>># zipimport: zlib available <<< 11044 1726853237.48883: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.freebsd' # import 'ansible.module_utils.facts.hardware.dragonfly' # <<< 11044 1726853237.48906: stdout chunk (state=3): >>># zipimport: zlib available <<< 11044 1726853237.49015: stdout chunk (state=3): >>># zipimport: zlib available <<< 11044 1726853237.49135: 
stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.hpux' # <<< 11044 1726853237.49163: stdout chunk (state=3): >>># zipimport: zlib available <<< 11044 1726853237.49181: stdout chunk (state=3): >>># zipimport: zlib available <<< 11044 1726853237.49214: stdout chunk (state=3): >>># zipimport: zlib available <<< 11044 1726853237.49786: stdout chunk (state=3): >>># zipimport: zlib available <<< 11044 1726853237.50292: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.linux' # import 'ansible.module_utils.facts.hardware.hurd' # <<< 11044 1726853237.50316: stdout chunk (state=3): >>># zipimport: zlib available <<< 11044 1726853237.50408: stdout chunk (state=3): >>># zipimport: zlib available <<< 11044 1726853237.50517: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.netbsd' # <<< 11044 1726853237.50543: stdout chunk (state=3): >>># zipimport: zlib available <<< 11044 1726853237.50651: stdout chunk (state=3): >>># zipimport: zlib available <<< 11044 1726853237.50720: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.openbsd' # <<< 11044 1726853237.50761: stdout chunk (state=3): >>># zipimport: zlib available <<< 11044 1726853237.50984: stdout chunk (state=3): >>># zipimport: zlib available <<< 11044 1726853237.51057: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.sunos' # # zipimport: zlib available <<< 11044 1726853237.51132: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.network' # # zipimport: zlib available <<< 11044 1726853237.51151: stdout chunk (state=3): >>># zipimport: zlib available <<< 11044 1726853237.51188: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.base' # # zipimport: zlib available <<< 11044 1726853237.51289: stdout chunk (state=3): >>># zipimport: zlib available <<< 11044 1726853237.51382: stdout chunk (state=3): >>># zipimport: zlib available <<< 11044 1726853237.51584: stdout 
chunk (state=3): >>># zipimport: zlib available <<< 11044 1726853237.51797: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.generic_bsd' # <<< 11044 1726853237.52131: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.aix' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.darwin' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.dragonfly' # # zipimport: zlib available # zipimport: zlib available <<< 11044 1726853237.52167: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.fc_wwn' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.freebsd' # # zipimport: zlib available <<< 11044 1726853237.52232: stdout chunk (state=3): >>># zipimport: zlib available <<< 11044 1726853237.52287: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.hpux' # <<< 11044 1726853237.52303: stdout chunk (state=3): >>># zipimport: zlib available <<< 11044 1726853237.52354: stdout chunk (state=3): >>># zipimport: zlib available <<< 11044 1726853237.52477: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.hurd' # # zipimport: zlib available <<< 11044 1726853237.52669: stdout chunk (state=3): >>># zipimport: zlib available <<< 11044 1726853237.52934: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.linux' # # zipimport: zlib available <<< 11044 1726853237.52991: stdout chunk (state=3): >>># zipimport: zlib available <<< 11044 1726853237.53053: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.iscsi' # # zipimport: zlib available <<< 11044 1726853237.53092: stdout chunk (state=3): >>># zipimport: zlib available <<< 11044 1726853237.53141: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.nvme' # <<< 11044 1726853237.53158: stdout chunk (state=3): >>># zipimport: zlib available <<< 11044 
1726853237.53203: stdout chunk (state=3): >>># zipimport: zlib available <<< 11044 1726853237.53206: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.netbsd' # # zipimport: zlib available <<< 11044 1726853237.53475: stdout chunk (state=3): >>># zipimport: zlib available <<< 11044 1726853237.53478: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.openbsd' # # zipimport: zlib available <<< 11044 1726853237.53522: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.network.sunos' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual' # # zipimport: zlib available <<< 11044 1726853237.53525: stdout chunk (state=3): >>># zipimport: zlib available <<< 11044 1726853237.53562: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.base' # <<< 11044 1726853237.53582: stdout chunk (state=3): >>># zipimport: zlib available <<< 11044 1726853237.53616: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 11044 1726853237.53655: stdout chunk (state=3): >>># zipimport: zlib available <<< 11044 1726853237.53702: stdout chunk (state=3): >>># zipimport: zlib available <<< 11044 1726853237.53780: stdout chunk (state=3): >>># zipimport: zlib available <<< 11044 1726853237.53869: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.sysctl' # import 'ansible.module_utils.facts.virtual.freebsd' # import 'ansible.module_utils.facts.virtual.dragonfly' # <<< 11044 1726853237.53874: stdout chunk (state=3): >>># zipimport: zlib available <<< 11044 1726853237.53911: stdout chunk (state=3): >>># zipimport: zlib available <<< 11044 1726853237.53980: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.hpux' # # zipimport: zlib available <<< 11044 1726853237.54160: stdout chunk (state=3): >>># zipimport: zlib available <<< 11044 1726853237.54345: stdout chunk (state=3): >>>import 
'ansible.module_utils.facts.virtual.linux' # <<< 11044 1726853237.54369: stdout chunk (state=3): >>># zipimport: zlib available <<< 11044 1726853237.54401: stdout chunk (state=3): >>># zipimport: zlib available <<< 11044 1726853237.54443: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.netbsd' # <<< 11044 1726853237.54466: stdout chunk (state=3): >>># zipimport: zlib available <<< 11044 1726853237.54503: stdout chunk (state=3): >>># zipimport: zlib available <<< 11044 1726853237.54541: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.openbsd' # <<< 11044 1726853237.54559: stdout chunk (state=3): >>># zipimport: zlib available <<< 11044 1726853237.54631: stdout chunk (state=3): >>># zipimport: zlib available <<< 11044 1726853237.54726: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.sunos' # import 'ansible.module_utils.facts.default_collectors' # <<< 11044 1726853237.54729: stdout chunk (state=3): >>># zipimport: zlib available <<< 11044 1726853237.54818: stdout chunk (state=3): >>># zipimport: zlib available <<< 11044 1726853237.54904: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.ansible_collector' # import 'ansible.module_utils.facts.compat' # import 'ansible.module_utils.facts' # <<< 11044 1726853237.55271: stdout chunk (state=3): >>># zipimport: zlib available <<< 11044 1726853237.56361: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc matches /usr/lib64/python3.12/encodings/idna.py <<< 11044 1726853237.56397: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc matches /usr/lib64/python3.12/stringprep.py <<< 11044 1726853237.56400: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc' <<< 11044 1726853237.56477: stdout chunk (state=3): >>># extension module 
'unicodedata' loaded from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' # extension module 'unicodedata' executed from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' import 'unicodedata' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6a481f7b00> import 'stringprep' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6a481f4830> <<< 11044 1726853237.56577: stdout chunk (state=3): >>>import 'encodings.idna' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6a481f5f10> <<< 11044 1726853237.57205: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_lsb": {}, "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_apparmor": {"status": "disabled"}, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_env": {"PYTHONVERBOSE": "1", "SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.46.199 44238 10.31.45.153 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.46.199 44238 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", 
"weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "13", "minute": "27", "second": "17", "epoch": "1726853237", "epoch_int": "1726853237", "date": "2024-09-20", "time": "13:27:17", "iso8601_micro": "2024-09-20T17:27:17.557130Z", "iso8601": "2024-09-20T17:27:17Z", "iso8601_basic": "20240920T132717557130", "iso8601_basic_short": "20240920T132717", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_fips": false, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_local": {}, "ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-45-153.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-45-153", "ansible_nodename": "ip-10-31-45-153.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": 
"x86_64", "ansible_machine_id": "ec26b9e88796a7cb9ebdc2656ce384f6", "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCXYnrsBaUY4i0/t1QUWoZkFPnFbGncbkmF01/zUZNuwldCwqYoDxpR2K8ARraEuK9oVLyYO0alCszdGP42db6R4xfRCOhN3faeZXsneupsJk4LLpIBkq0uIokeAtcL1tPOUQQzfsQqzZzp4BmJCVrwmUW5ADnzqCgvB3gsruyTQUrEUJ9MtB5zdaQm5MXuipjeZQThTjYCB2aXv/qTdzfKAwds3CoSZ6HA5GNdi6tahsy3CRIn6VtVkvwrqjJcwo+RaRQzjh+C9AUoH2YQmfLbvog62MsnLk/5OPq5HhxO81pm/TJDsI4LXwLh1VviMOWzVvIaXuKwdmYAdgX1NU561bBzeYzi55qBKo4TcMmnOXiV+Es7dDrKjwwpQKsv5tjSqVkcO6ek3I6SI38DXFOBLZtqXOOLsO12iOReYJUWe/+cgBrz12kDCPmaHFzFFZ3+N0GQ/WiYcgqiUItIhb3xJTbPqls0czPCpCzKo57GyTmv17fpfGhBfRoGX/H1zYs=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public"<<< 11044 1726853237.57220: stdout chunk (state=3): >>>: "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBDOnt+7F+RlMaCRRXs8YyuhiHP1FmeDlj4rNB/K2mg1iP9loXXc/XjJ083xMBDu7m7uYLGB+dnmj299Y+RcAQpE=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIKmxoIJtw8UORlc+o+Q7Pks5ERSyhMLnl+Oo8W221WGj", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_pkg_mgr": "dnf", "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_service_mgr": "systemd", "gather_subset": ["min"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["min"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 11044 1726853237.57870: stdout chunk (state=3): >>># clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path <<< 11044 1726853237.57893: 
stdout chunk (state=3): >>># clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin <<< 11044 1726853237.57918: stdout chunk (state=3): >>># restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing 
re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 <<< 11044 1726853237.57942: stdout chunk (state=3): >>># cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing 
json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl <<< 11044 1726853237.57993: stdout chunk (state=3): >>># cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes 
<<< 11044 1726853237.58038: stdout chunk (state=3): >>># cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing 
ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _heapq # cleanup[2] removing heapq # destroy heapq # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils <<< 11044 1726853237.58092: stdout chunk (state=3): >>># cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy 
ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing 
ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd <<< 11044 1726853237.58100: stdout chunk (state=3): >>># cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy 
ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils <<< 11044 1726853237.58132: stdout chunk (state=3): >>># destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy 
ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna <<< 11044 1726853237.58468: stdout chunk (state=3): >>># destroy _sitebuiltins <<< 11044 1726853237.58499: stdout chunk (state=3): >>># destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma # destroy _blake2 # destroy binascii # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path <<< 11044 1726853237.58547: stdout chunk (state=3): >>># destroy zipfile # destroy pathlib # destroy 
zipfile._path.glob # destroy ipaddress # destroy ntpath <<< 11044 1726853237.58557: stdout chunk (state=3): >>># destroy importlib # destroy zipimport # destroy __main__ # destroy systemd.journal # destroy systemd.daemon # destroy hashlib <<< 11044 1726853237.58576: stdout chunk (state=3): >>># destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings <<< 11044 1726853237.58606: stdout chunk (state=3): >>># destroy _locale # destroy locale # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid <<< 11044 1726853237.58649: stdout chunk (state=3): >>># destroy selinux # destroy shutil <<< 11044 1726853237.58659: stdout chunk (state=3): >>># destroy distro # destroy distro.distro <<< 11044 1726853237.58667: stdout chunk (state=3): >>># destroy argparse # destroy logging <<< 11044 1726853237.58919: stdout chunk (state=3): >>># destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.connection # destroy multiprocessing.pool # destroy signal # destroy pickle # destroy multiprocessing.context # destroy array # destroy _compat_pickle # destroy _pickle # destroy queue # destroy _heapq # destroy _queue # destroy multiprocessing.process # destroy unicodedata # destroy tempfile # destroy multiprocessing.util # destroy multiprocessing.reduction # destroy selectors # destroy _multiprocessing # destroy shlex # destroy fcntl # destroy datetime # destroy subprocess # destroy base64 # destroy _ssl # destroy ansible.module_utils.compat.selinux # destroy getpass # destroy pwd # destroy termios # destroy errno # destroy json # destroy socket # destroy struct # destroy glob # destroy fnmatch # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # cleanup[3] wiping encodings.idna # destroy stringprep # 
cleanup[3] wiping configparser # cleanup[3] wiping selinux._selinux <<< 11044 1726853237.58943: stdout chunk (state=3): >>># cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap <<< 11044 1726853237.58961: stdout chunk (state=3): >>># cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 <<< 11044 1726853237.59078: stdout chunk (state=3): >>># cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler <<< 11044 1726853237.59085: stdout chunk (state=3): >>># destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping 
stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime <<< 11044 1726853237.59249: stdout chunk (state=3): >>># destroy sys.monitoring # destroy _socket <<< 11044 1726853237.59263: stdout chunk (state=3): >>># destroy _collections <<< 11044 1726853237.59292: stdout chunk (state=3): >>># destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize <<< 11044 1726853237.59312: stdout chunk (state=3): >>># destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib <<< 11044 1726853237.59341: stdout chunk (state=3): >>># destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser <<< 11044 1726853237.59366: stdout chunk (state=3): >>># destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal <<< 11044 1726853237.59466: stdout chunk (state=3): >>># clear sys.meta_path # clear sys.modules # destroy _frozen_importlib <<< 11044 1726853237.59613: stdout chunk (state=3): 
>>># destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy encodings.idna # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time <<< 11044 1726853237.59618: stdout chunk (state=3): >>># destroy _random # destroy _weakref # destroy _hashlib <<< 11044 1726853237.59797: stdout chunk (state=3): >>># destroy _operator # destroy _sre # destroy _string # destroy re # destroy itertools <<< 11044 1726853237.59801: stdout chunk (state=3): >>># destroy _abc # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks <<< 11044 1726853237.60221: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.45.153 closed. <<< 11044 1726853237.60233: stderr chunk (state=3): >>><<< 11044 1726853237.60236: stdout chunk (state=3): >>><<< 11044 1726853237.60569: _low_level_execute_command() done: rc=0, stdout=import _frozen_importlib # frozen import _imp # builtin import '_thread' # import '_warnings' # import '_weakref' # import '_io' # import 'marshal' # import 'posix' # import '_frozen_importlib_external' # # installing zipimport hook import 'time' # import 'zipimport' # # installed zipimport hook # /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' import '_codecs' # import 'codecs' # # /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6a493684d0> import 'encodings' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f6a49337b30> # /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6a4936aa50> import '_signal' # import '_abc' # import 'abc' # import 'io' # import '_stat' # import 'stat' # import '_collections_abc' # import 'genericpath' # import 'posixpath' # import 'os' # import '_sitebuiltins' # Processing user site-packages Processing global site-packages Adding directory: '/usr/lib64/python3.12/site-packages' Adding directory: '/usr/lib/python3.12/site-packages' Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' # /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8_sig.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6a4913d130> # /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py # code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6a4913dfa0> import 'site' # Python 3.12.5 (main, Aug 23 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. 
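(Annotation: the `import 'x' # <...SourceFileLoader...>` lines above, and the `# cleanup[2] removing ...` lines earlier, match CPython's verbose-import trace — the same output `python -v` produces — emitted by the remote interpreter while running the module and relayed back in this debug log. A minimal sketch of reproducing that trace style locally, assuming only a working `python3`:)

```python
import subprocess
import sys

# Run a trivial program under CPython's verbose mode; the import trace is
# written to stderr, starting with the frozen importlib bootstrap.
proc = subprocess.run(
    [sys.executable, "-v", "-c", "pass"],
    capture_output=True,
    text=True,
)
print(proc.stderr.splitlines()[0])
```

Comparing a local trace against the remote one above is a quick way to confirm whether an odd module failure comes from the target host's interpreter environment rather than from Ansible itself.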
# /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py # code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py # code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py # code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py # code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6a4917bec0> # /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py # code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6a4917bf80> # /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py # code object from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py # code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' import 'itertools' # # /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py # code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6a491b3830> # /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py # code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' import 'reprlib' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f6a491b3ec0> import '_collections' # import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6a49193b60> import '_functools' # import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6a491912b0> import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6a49179070> # /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py # code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' import '_sre' # # /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py # code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py # code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6a491d37d0> import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6a491d23f0> # /usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py # code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6a49192150> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6a491d0bc0> # /usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py # code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6a49208890> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6a491782f0> # /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches 
/usr/lib64/python3.12/struct.py # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' # extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6a49208d40> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6a49208bf0> # extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6a49208fe0> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6a49176e10> # /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py # code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6a49209670> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6a49209370> import 'importlib.machinery' # # /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6a4920a540> import 'importlib.util' # import 'runpy' # # /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches 
/usr/lib64/python3.12/shutil.py # code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py # code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6a49220740> import 'errno' # # extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6a49221e20> # /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py # code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py # code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6a49222cc0> # extension module '_bz2' loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6a492232f0> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6a49222210> # /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py # code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' # extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7f6a49223d70> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6a492234a0> import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6a4920a4b0> # /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py # code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py # code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' # extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6a48f17c50> # /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py # code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' # extension module '_bisect' loaded from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6a48f407a0> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6a48f40500> # extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6a48f407d0> # /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py # code object from 
'/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' # extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' # extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6a48f41100> # extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6a48f41af0> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6a48f409b0> import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6a48f15df0> # /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py # code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py # code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6a48f42f00> import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6a48f41c40> import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6a4920ac60> # /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py # code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py # code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' import 'threading' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f6a48f6b230> # /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py # code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6a48f8f5f0> # /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py # code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' import 'ntpath' # # /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6a48ff0380> # /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py # code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6a48ff2ae0> import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6a48ff04a0> import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6a48fb1370> # /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f6a48929430> import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6a48f8e3f0> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6a48f43e00> # code object from '/usr/lib64/python3.12/encodings/cp437.pyc' import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7f6a48f8e750> # zipimport: found 103 names in '/tmp/ansible_setup_payload_2dnodk4o/ansible_setup_payload.zip' # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py # code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py # code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py # code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6a48993170> import '_typing' # import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6a48972060> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6a489711c0> # zipimport: zlib available import 'ansible' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py # code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6a48991040> # /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc 
matches /usr/lib64/python3.12/json/__init__.py # code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' # extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6a489c2b10> import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6a489c28a0> import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6a489c21b0> # /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6a489c2ba0> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6a48993e00> import 'atexit' # # extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6a489c3860> # extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7f6a489c39e0> # /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py # code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' import '_locale' # import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6a489c3ef0> import 'pwd' # # /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py # code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6a4882dc70> # extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6a4882f890> # /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py # code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6a48830290> # /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py # code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6a48831430> # /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py # code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py # code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6a48833f20> # extension module 
'_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6a48f42e70> import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6a488321e0> # /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py # code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py # code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py # code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py # code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6a4883bdd0> import '_tokenize' # import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6a4883a8a0> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6a4883a600> # /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py # code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6a4883ab70> import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6a488326f0> # extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' # extension module 'syslog' executed from 
'/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6a4887ff80> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6a4887ffe0> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' # extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6a48881b50> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6a48881910> # /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py # code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' # extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6a48883fe0> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6a488821b0> # 
/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py # code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py # code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' import '_string' # import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6a48887770> import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6a48884140> # extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6a48888530> # extension module 'systemd._reader' loaded from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6a48888770> # extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6a48888aa0> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6a48880260> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from 
'/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py # code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' # extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' # extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6a487140e0> # extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6a48715310> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6a4888a870> # extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6a4888bbf0> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6a4888a4e0> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.six' # import 'ansible.module_utils.six.moves' 
# import 'ansible.module_utils.six.moves.collections_abc' # import 'ansible.module_utils.common.text.converters' # # /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' # extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6a48719430> # /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6a4871a1b0> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6a48888110> import 'ansible.module_utils.compat.selinux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils._text' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py # code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6a4871a360> # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.collections' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.warnings' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.errors' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing' # # zipimport: zlib available # zipimport: zlib 
available import 'ansible.module_utils.parsing.convert_bool' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py # code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' import '_ast' # import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6a4871b440> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text.formatters' # import 'ansible.module_utils.common.validation' # import 'ansible.module_utils.common.parameters' # import 'ansible.module_utils.common.arg_spec' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.locale' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py # code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' # extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6a48726090> import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6a487216a0> import 'ansible.module_utils.common.file' # import 'ansible.module_utils.common.process' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py # code object from 
'/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' # /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/distro.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py # code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py # code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6a4880e900> import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6a489ee5d0> import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6a48726060> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6a487183b0> # destroy ansible.module_utils.distro import 'ansible.module_utils.distro' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common._utils' # import 'ansible.module_utils.common.sys_info' # import 'ansible.module_utils.basic' # # zipimport: zlib available # zipimport: zlib available import 'ansible.modules' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.namespace' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.typing' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # 
/usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/__init__.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/context.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/process.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc' import 'multiprocessing.process' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6a487b6030> # /usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/reduction.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc matches /usr/lib64/python3.12/pickle.py # code object from '/usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc matches /usr/lib64/python3.12/_compat_pickle.py # code object from '/usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc' import '_compat_pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6a4833ff80> # extension module '_pickle' loaded from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' # extension module '_pickle' executed from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' import '_pickle' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6a48354590> import 'pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6a4879e8a0> import 'multiprocessing.reduction' # <_frozen_importlib_external.SourceFileLoader 
object at 0x7f6a487b6ba0> import 'multiprocessing.context' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6a487b47d0> import 'multiprocessing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6a487b4320> # /usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/pool.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc matches /usr/lib64/python3.12/queue.py # code object from '/usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc matches /usr/lib64/python3.12/heapq.py # code object from '/usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc' # extension module '_heapq' loaded from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' # extension module '_heapq' executed from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6a48357350> import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6a48356c00> # extension module '_queue' loaded from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' # extension module '_queue' executed from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' import '_queue' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6a48356de0> import 'queue' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6a48356030> # /usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/util.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc' import 'multiprocessing.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6a48357530> # 
/usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/connection.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc' # extension module '_multiprocessing' loaded from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' # extension module '_multiprocessing' executed from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' import '_multiprocessing' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6a483a2060> import 'multiprocessing.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6a48357f80> import 'multiprocessing.pool' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6a487b4380> import 'ansible.module_utils.facts.timeout' # import 'ansible.module_utils.facts.collector' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other.facter' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other.ohai' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.apparmor' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.caps' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.chroot' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.utils' # import 'ansible.module_utils.facts.system.cmdline' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.distribution' # # zipimport: zlib 
available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.datetime' # import 'ansible.module_utils.facts.system.date_time' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.env' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.dns' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.fips' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.loadavg' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/glob.py # code object from '/usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc' import 'glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6a483a37d0> # /usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc matches /usr/lib64/python3.12/configparser.py # code object from '/usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc' import 'configparser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6a483a2db0> import 'ansible.module_utils.facts.system.local' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.lsb' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.pkg_mgr' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.platform' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc matches /usr/lib64/python3.12/ssl.py # code object from '/usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc' # extension module '_ssl' loaded from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' # extension module '_ssl' executed from 
'/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' import '_ssl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6a483e23c0> import 'ssl' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6a483d3260> import 'ansible.module_utils.facts.system.python' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.selinux' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.version' # import 'ansible.module_utils.facts.system.service_mgr' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.ssh_pub_keys' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc matches /usr/lib64/python3.12/getpass.py # code object from '/usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc' # extension module 'termios' loaded from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' # extension module 'termios' executed from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' import 'termios' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6a483f5d00> import 'getpass' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6a483d3380> import 'ansible.module_utils.facts.system.user' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.base' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.aix' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.sysctl' # import 'ansible.module_utils.facts.hardware.darwin' # # zipimport: zlib 
available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.freebsd' # import 'ansible.module_utils.facts.hardware.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.hpux' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.linux' # import 'ansible.module_utils.facts.hardware.hurd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.sunos' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.base' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.generic_bsd' # import 'ansible.module_utils.facts.network.aix' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.darwin' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.fc_wwn' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.freebsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.hpux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.hurd' # # zipimport: zlib available # zipimport: zlib available 
import 'ansible.module_utils.facts.network.linux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.iscsi' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.nvme' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.sunos' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.base' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.sysctl' # import 'ansible.module_utils.facts.virtual.freebsd' # import 'ansible.module_utils.facts.virtual.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.hpux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.linux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.sunos' # import 'ansible.module_utils.facts.default_collectors' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.ansible_collector' # import 'ansible.module_utils.facts.compat' # import 'ansible.module_utils.facts' # # zipimport: zlib available # 
/usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc matches /usr/lib64/python3.12/encodings/idna.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc matches /usr/lib64/python3.12/stringprep.py # code object from '/usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc' # extension module 'unicodedata' loaded from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' # extension module 'unicodedata' executed from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' import 'unicodedata' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6a481f7b00> import 'stringprep' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6a481f4830> import 'encodings.idna' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6a481f5f10> {"ansible_facts": {"ansible_lsb": {}, "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_apparmor": {"status": "disabled"}, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_env": {"PYTHONVERBOSE": "1", "SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.46.199 44238 10.31.45.153 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.46.199 44238 22", "DEBUGINFOD_URLS": 
"https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "13", "minute": "27", "second": "17", "epoch": "1726853237", "epoch_int": "1726853237", "date": "2024-09-20", "time": "13:27:17", "iso8601_micro": "2024-09-20T17:27:17.557130Z", "iso8601": "2024-09-20T17:27:17Z", "iso8601_basic": "20240920T132717557130", "iso8601_basic_short": "20240920T132717", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_fips": false, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_local": {}, "ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", 
"ansible_fqdn": "ip-10-31-45-153.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-45-153", "ansible_nodename": "ip-10-31-45-153.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec26b9e88796a7cb9ebdc2656ce384f6", "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCXYnrsBaUY4i0/t1QUWoZkFPnFbGncbkmF01/zUZNuwldCwqYoDxpR2K8ARraEuK9oVLyYO0alCszdGP42db6R4xfRCOhN3faeZXsneupsJk4LLpIBkq0uIokeAtcL1tPOUQQzfsQqzZzp4BmJCVrwmUW5ADnzqCgvB3gsruyTQUrEUJ9MtB5zdaQm5MXuipjeZQThTjYCB2aXv/qTdzfKAwds3CoSZ6HA5GNdi6tahsy3CRIn6VtVkvwrqjJcwo+RaRQzjh+C9AUoH2YQmfLbvog62MsnLk/5OPq5HhxO81pm/TJDsI4LXwLh1VviMOWzVvIaXuKwdmYAdgX1NU561bBzeYzi55qBKo4TcMmnOXiV+Es7dDrKjwwpQKsv5tjSqVkcO6ek3I6SI38DXFOBLZtqXOOLsO12iOReYJUWe/+cgBrz12kDCPmaHFzFFZ3+N0GQ/WiYcgqiUItIhb3xJTbPqls0czPCpCzKo57GyTmv17fpfGhBfRoGX/H1zYs=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBDOnt+7F+RlMaCRRXs8YyuhiHP1FmeDlj4rNB/K2mg1iP9loXXc/XjJ083xMBDu7m7uYLGB+dnmj299Y+RcAQpE=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIKmxoIJtw8UORlc+o+Q7Pks5ERSyhMLnl+Oo8W221WGj", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_pkg_mgr": "dnf", "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_service_mgr": "systemd", "gather_subset": ["min"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["min"], 
"gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} # clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] 
removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # 
cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian 
# cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy 
ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _heapq # cleanup[2] removing heapq # destroy heapq # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # 
cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # 
cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy 
ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy 
ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna # destroy _sitebuiltins # destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma # destroy _blake2 # destroy binascii # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy ipaddress # destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy locale # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # destroy selinux # destroy shutil # 
destroy distro # destroy distro.distro # destroy argparse # destroy logging # destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.connection # destroy multiprocessing.pool # destroy signal # destroy pickle # destroy multiprocessing.context # destroy array # destroy _compat_pickle # destroy _pickle # destroy queue # destroy _heapq # destroy _queue # destroy multiprocessing.process # destroy unicodedata # destroy tempfile # destroy multiprocessing.util # destroy multiprocessing.reduction # destroy selectors # destroy _multiprocessing # destroy shlex # destroy fcntl # destroy datetime # destroy subprocess # destroy base64 # destroy _ssl # destroy ansible.module_utils.compat.selinux # destroy getpass # destroy pwd # destroy termios # destroy errno # destroy json # destroy socket # destroy struct # destroy glob # destroy fnmatch # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping configparser # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] 
wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy sys.monitoring # destroy _socket # destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy 
ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib # destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy encodings.idna # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref # destroy _hashlib # destroy _operator # destroy _sre # destroy _string # destroy re # destroy itertools # destroy _abc # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration 
data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.45.153 closed. [WARNING]: Module invocation had junk after the JSON data: 11044 1726853237.61547: done with _execute_module (setup, {'gather_subset': 'min', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': 
'/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853237.0845318-11147-73532273450786/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11044 1726853237.61550: _low_level_execute_command(): starting 11044 1726853237.61553: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853237.0845318-11147-73532273450786/ > /dev/null 2>&1 && sleep 0' 11044 1726853237.61556: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11044 1726853237.61561: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11044 1726853237.61570: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11044 1726853237.61585: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11044 1726853237.61595: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 <<< 11044 1726853237.61608: stderr chunk (state=3): >>>debug2: match not found <<< 11044 1726853237.61611: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11044 1726853237.61625: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found <<< 11044 1726853237.61628: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11044 1726853237.61677: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing 
master at '/root/.ansible/cp/a2da574bb2' <<< 11044 1726853237.61699: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11044 1726853237.61705: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11044 1726853237.61756: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11044 1726853237.64281: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11044 1726853237.64285: stdout chunk (state=3): >>><<< 11044 1726853237.64355: stderr chunk (state=3): >>><<< 11044 1726853237.64359: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11044 1726853237.64361: handler run complete 11044 1726853237.64364: variable 'ansible_facts' from source: unknown 11044 1726853237.64415: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11044 1726853237.64577: variable 'ansible_facts' from source: unknown 11044 1726853237.64592: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11044 1726853237.64701: attempt loop complete, returning result 11044 1726853237.64705: _execute() done 11044 1726853237.64707: dumping result to json 11044 1726853237.64709: done dumping result, returning 11044 1726853237.64712: done running TaskExecutor() for managed_node1/TASK: Gather the minimum subset of ansible_facts required by the network role test [02083763-bbaf-c5a6-f857-0000000000de] 11044 1726853237.64714: sending task result for task 02083763-bbaf-c5a6-f857-0000000000de 11044 1726853237.64910: done sending task result for task 02083763-bbaf-c5a6-f857-0000000000de 11044 1726853237.64913: WORKER PROCESS EXITING ok: [managed_node1] 11044 1726853237.65049: no more pending results, returning what we have 11044 1726853237.65052: results queue empty 11044 1726853237.65054: checking for any_errors_fatal 11044 1726853237.65055: done checking for any_errors_fatal 11044 1726853237.65056: checking for max_fail_percentage 11044 1726853237.65058: done checking for max_fail_percentage 11044 1726853237.65058: checking to see if all hosts have failed and the running result is not ok 11044 1726853237.65059: done checking to see if all hosts have failed 11044 1726853237.65060: getting the remaining hosts for this loop 11044 1726853237.65061: done getting the remaining hosts for this loop 11044 1726853237.65065: getting the next task for host managed_node1 11044 1726853237.65077: done getting next task for host managed_node1 11044 1726853237.65080: ^ task is: TASK: Check if system is ostree 11044 1726853237.65083: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, 
pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11044 1726853237.65087: getting variables 11044 1726853237.65088: in VariableManager get_vars() 11044 1726853237.65118: Calling all_inventory to load vars for managed_node1 11044 1726853237.65121: Calling groups_inventory to load vars for managed_node1 11044 1726853237.65124: Calling all_plugins_inventory to load vars for managed_node1 11044 1726853237.65136: Calling all_plugins_play to load vars for managed_node1 11044 1726853237.65139: Calling groups_plugins_inventory to load vars for managed_node1 11044 1726853237.65142: Calling groups_plugins_play to load vars for managed_node1 11044 1726853237.65382: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11044 1726853237.65537: done with get_vars() 11044 1726853237.65546: done getting variables TASK [Check if system is ostree] *********************************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:17 Friday 20 September 2024 13:27:17 -0400 (0:00:00.680) 0:00:02.031 ****** 11044 1726853237.65618: entering _queue_task() for managed_node1/stat 11044 1726853237.65819: worker is 1 (out of 1 available) 11044 1726853237.65832: exiting _queue_task() for managed_node1/stat 11044 1726853237.65843: done queuing things up, now waiting for results queue to drain 11044 1726853237.65847: waiting for pending results... 
11044 1726853237.65992: running TaskExecutor() for managed_node1/TASK: Check if system is ostree 11044 1726853237.66051: in run() - task 02083763-bbaf-c5a6-f857-0000000000e0 11044 1726853237.66061: variable 'ansible_search_path' from source: unknown 11044 1726853237.66064: variable 'ansible_search_path' from source: unknown 11044 1726853237.66096: calling self._execute() 11044 1726853237.66309: variable 'ansible_host' from source: host vars for 'managed_node1' 11044 1726853237.66313: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11044 1726853237.66316: variable 'omit' from source: magic vars 11044 1726853237.66685: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 11044 1726853237.67177: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 11044 1726853237.67181: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 11044 1726853237.67184: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 11044 1726853237.67187: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 11044 1726853237.67312: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 11044 1726853237.67381: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 11044 1726853237.67415: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 11044 1726853237.67446: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 11044 1726853237.67573: Evaluated conditional (not __network_is_ostree is defined): True 11044 1726853237.67584: variable 'omit' from source: magic vars 11044 1726853237.67630: variable 'omit' from source: magic vars 11044 1726853237.67685: variable 'omit' from source: magic vars 11044 1726853237.67727: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11044 1726853237.67748: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11044 1726853237.67776: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11044 1726853237.67785: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11044 1726853237.67796: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11044 1726853237.67835: variable 'inventory_hostname' from source: host vars for 'managed_node1' 11044 1726853237.67839: variable 'ansible_host' from source: host vars for 'managed_node1' 11044 1726853237.67841: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11044 1726853237.67976: Set connection var ansible_timeout to 10 11044 1726853237.67980: Set connection var ansible_shell_executable to /bin/sh 11044 1726853237.67982: Set connection var ansible_shell_type to sh 11044 1726853237.67984: Set connection var ansible_module_compression to ZIP_DEFLATED 11044 1726853237.67990: Set connection var ansible_connection to ssh 11044 1726853237.68177: Set connection var ansible_pipelining to False 11044 1726853237.68181: variable 'ansible_shell_executable' from source: unknown 11044 1726853237.68184: variable 'ansible_connection' from 
source: unknown 11044 1726853237.68186: variable 'ansible_module_compression' from source: unknown 11044 1726853237.68189: variable 'ansible_shell_type' from source: unknown 11044 1726853237.68191: variable 'ansible_shell_executable' from source: unknown 11044 1726853237.68193: variable 'ansible_host' from source: host vars for 'managed_node1' 11044 1726853237.68195: variable 'ansible_pipelining' from source: unknown 11044 1726853237.68197: variable 'ansible_timeout' from source: unknown 11044 1726853237.68199: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11044 1726853237.68229: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 11044 1726853237.68247: variable 'omit' from source: magic vars 11044 1726853237.68256: starting attempt loop 11044 1726853237.68262: running the handler 11044 1726853237.68281: _low_level_execute_command(): starting 11044 1726853237.68293: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11044 1726853237.68952: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11044 1726853237.68956: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11044 1726853237.68958: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config <<< 11044 1726853237.68960: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11044 1726853237.69008: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 11044 1726853237.69012: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11044 1726853237.69059: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 11044 1726853237.71276: stdout chunk (state=3): >>>/root <<< 11044 1726853237.71425: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11044 1726853237.71429: stdout chunk (state=3): >>><<< 11044 1726853237.71431: stderr chunk (state=3): >>><<< 11044 1726853237.71457: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: 
Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 11044 1726853237.71569: _low_level_execute_command(): starting 11044 1726853237.71576: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853237.7147403-11186-277636433844309 `" && echo ansible-tmp-1726853237.7147403-11186-277636433844309="` echo /root/.ansible/tmp/ansible-tmp-1726853237.7147403-11186-277636433844309 `" ) && sleep 0' 11044 1726853237.72015: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11044 1726853237.72027: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11044 1726853237.72039: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11044 1726853237.72083: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 11044 1726853237.72091: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11044 
1726853237.72105: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11044 1726853237.72168: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 11044 1726853237.74877: stdout chunk (state=3): >>>ansible-tmp-1726853237.7147403-11186-277636433844309=/root/.ansible/tmp/ansible-tmp-1726853237.7147403-11186-277636433844309 <<< 11044 1726853237.75041: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11044 1726853237.75075: stderr chunk (state=3): >>><<< 11044 1726853237.75078: stdout chunk (state=3): >>><<< 11044 1726853237.75096: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853237.7147403-11186-277636433844309=/root/.ansible/tmp/ansible-tmp-1726853237.7147403-11186-277636433844309 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 11044 
1726853237.75142: variable 'ansible_module_compression' from source: unknown 11044 1726853237.75187: ANSIBALLZ: Using lock for stat 11044 1726853237.75190: ANSIBALLZ: Acquiring lock 11044 1726853237.75193: ANSIBALLZ: Lock acquired: 140360202229408 11044 1726853237.75195: ANSIBALLZ: Creating module 11044 1726853237.82600: ANSIBALLZ: Writing module into payload 11044 1726853237.82665: ANSIBALLZ: Writing module 11044 1726853237.82683: ANSIBALLZ: Renaming module 11044 1726853237.82689: ANSIBALLZ: Done creating module 11044 1726853237.82703: variable 'ansible_facts' from source: unknown 11044 1726853237.82757: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853237.7147403-11186-277636433844309/AnsiballZ_stat.py 11044 1726853237.82855: Sending initial data 11044 1726853237.82858: Sent initial data (153 bytes) 11044 1726853237.83313: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11044 1726853237.83317: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found <<< 11044 1726853237.83319: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11044 1726853237.83323: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 <<< 11044 1726853237.83325: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 11044 1726853237.83376: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 11044 1726853237.83379: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11044 1726853237.83447: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 11044 1726853237.85732: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 11044 1726853237.85770: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 11044 1726853237.85822: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-1104467doc9gy/tmpspmnc5c4 /root/.ansible/tmp/ansible-tmp-1726853237.7147403-11186-277636433844309/AnsiballZ_stat.py <<< 11044 1726853237.85825: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853237.7147403-11186-277636433844309/AnsiballZ_stat.py" <<< 11044 1726853237.85859: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-1104467doc9gy/tmpspmnc5c4" to remote "/root/.ansible/tmp/ansible-tmp-1726853237.7147403-11186-277636433844309/AnsiballZ_stat.py" <<< 11044 1726853237.85862: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853237.7147403-11186-277636433844309/AnsiballZ_stat.py" <<< 11044 1726853237.86392: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11044 1726853237.86436: stderr chunk (state=3): >>><<< 11044 1726853237.86439: stdout chunk (state=3): >>><<< 11044 1726853237.86474: done transferring module to remote 11044 1726853237.86487: _low_level_execute_command(): starting 11044 1726853237.86491: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853237.7147403-11186-277636433844309/ /root/.ansible/tmp/ansible-tmp-1726853237.7147403-11186-277636433844309/AnsiballZ_stat.py && sleep 0' 11044 1726853237.86942: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11044 1726853237.86948: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found <<< 11044 1726853237.86950: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 11044 1726853237.86952: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11044 1726853237.86954: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11044 1726853237.87002: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 11044 1726853237.87008: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11044 1726853237.87011: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11044 1726853237.87054: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 11044 1726853237.89739: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11044 1726853237.89743: stdout chunk (state=3): >>><<< 11044 1726853237.89745: stderr chunk (state=3): >>><<< 11044 1726853237.89748: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config 
debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 11044 1726853237.89751: _low_level_execute_command(): starting 11044 1726853237.89753: _low_level_execute_command(): executing: /bin/sh -c 'PYTHONVERBOSE=1 /usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853237.7147403-11186-277636433844309/AnsiballZ_stat.py && sleep 0' 11044 1726853237.90219: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11044 1726853237.90224: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11044 1726853237.90251: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11044 1726853237.90255: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration <<< 11044 1726853237.90257: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11044 1726853237.90259: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match 
found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11044 1726853237.90313: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 11044 1726853237.90317: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11044 1726853237.90397: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 11044 1726853237.93528: stdout chunk (state=3): >>>import _frozen_importlib # frozen <<< 11044 1726853237.93614: stdout chunk (state=3): >>>import _imp # builtin <<< 11044 1726853237.93626: stdout chunk (state=3): >>>import '_thread' # import '_warnings' # import '_weakref' # <<< 11044 1726853237.93773: stdout chunk (state=3): >>>import '_io' # import 'marshal' # import 'posix' # <<< 11044 1726853237.94013: stdout chunk (state=3): >>>import '_frozen_importlib_external' # # installing zipimport hook import 'time' # import 'zipimport' # # installed zipimport hook # /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' import '_codecs' # import 'codecs' # <<< 11044 1726853237.94020: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py <<< 11044 1726853237.94064: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36868184d0> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36867e7b30> <<< 11044 1726853237.94301: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py # code object from 
'/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7f368681aa50> import '_signal' # import '_abc' # import 'abc' # import 'io' # import '_stat' # import 'stat' # <<< 11044 1726853237.94387: stdout chunk (state=3): >>>import '_collections_abc' # <<< 11044 1726853237.94426: stdout chunk (state=3): >>>import 'genericpath' # import 'posixpath' # <<< 11044 1726853237.94547: stdout chunk (state=3): >>>import 'os' # import '_sitebuiltins' # Processing user site-packages Processing global site-packages Adding directory: '/usr/lib64/python3.12/site-packages' Adding directory: '/usr/lib/python3.12/site-packages' Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' # /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8_sig.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7f368662d130> <<< 11044 1726853237.94592: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py <<< 11044 1726853237.94627: stdout chunk (state=3): >>># code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7f368662dfa0> <<< 11044 1726853237.94667: stdout chunk (state=3): >>>import 'site' # <<< 11044 1726853237.94952: stdout chunk (state=3): >>>Python 3.12.5 (main, Aug 23 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. 
<<< 11044 1726853237.95101: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py # code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py # code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py # code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py # code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7f368666bec0> # /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py # code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' <<< 11044 1726853237.95122: stdout chunk (state=3): >>>import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7f368666bf80> <<< 11044 1726853237.95143: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py <<< 11044 1726853237.95184: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' <<< 11044 1726853237.95195: stdout chunk (state=3): >>># /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py <<< 11044 1726853237.95233: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' <<< 11044 1726853237.95356: stdout chunk (state=3): >>>import 'itertools' # # /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches 
/usr/lib64/python3.12/keyword.py # code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36866a3830> <<< 11044 1726853237.95359: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py # code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36866a3ec0> import '_collections' # <<< 11044 1726853237.95416: stdout chunk (state=3): >>>import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3686683b60> import '_functools' # <<< 11044 1726853237.95432: stdout chunk (state=3): >>>import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36866812b0> <<< 11044 1726853237.95575: stdout chunk (state=3): >>>import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3686669070> # /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py # code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' <<< 11044 1726853237.95598: stdout chunk (state=3): >>>import '_sre' # # /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py <<< 11044 1726853237.95634: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' <<< 11044 1726853237.95729: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py # code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' <<< 11044 1726853237.95737: stdout chunk (state=3): >>>import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36866c37d0> <<< 11044 1726853237.95910: stdout chunk (state=3): >>>import 're._parser' 
# <_frozen_importlib_external.SourceFileLoader object at 0x7f36866c23f0> # /usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py # code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3686682150> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36866c0bc0> <<< 11044 1726853237.95914: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py # code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36866f8890> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36866682f0> <<< 11044 1726853237.95969: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches /usr/lib64/python3.12/struct.py # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' # extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f36866f8d40> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36866f8bf0> # extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f36866f8fe0> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3686666e10> <<< 11044 1726853237.95989: stdout chunk (state=3): >>># 
/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py <<< 11044 1726853237.96061: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36866f9670> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36866f9370> import 'importlib.machinery' # <<< 11044 1726853237.96139: stdout chunk (state=3): >>># /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36866fa540> <<< 11044 1726853237.96241: stdout chunk (state=3): >>>import 'importlib.util' # import 'runpy' # # /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches /usr/lib64/python3.12/shutil.py # code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' <<< 11044 1726853237.96308: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py # code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3686710740> import 'errno' # # extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3686711e20> # 
/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py <<< 11044 1726853237.96372: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' <<< 11044 1726853237.96376: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py # code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3686712cc0> <<< 11044 1726853237.96499: stdout chunk (state=3): >>># extension module '_bz2' loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f36867132f0> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3686712210> <<< 11044 1726853237.96504: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py # code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' <<< 11044 1726853237.96541: stdout chunk (state=3): >>># extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3686713d70> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36867134a0> import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36866fa4b0> <<< 11044 1726853237.96581: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py <<< 11044 1726853237.96607: stdout chunk 
(state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py # code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' <<< 11044 1726853237.96687: stdout chunk (state=3): >>># extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f36864c3c50> <<< 11044 1726853237.96726: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py # code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' # extension module '_bisect' loaded from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f36864ec710> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36864ec470> # extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f36864ec740> <<< 11044 1726853237.96962: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py # code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' # extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' <<< 11044 
1726853237.96983: stdout chunk (state=3): >>># extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f36864ed070> <<< 11044 1726853237.97106: stdout chunk (state=3): >>># extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f36864eda60> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36864ec920> <<< 11044 1726853237.97186: stdout chunk (state=3): >>>import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36864c1df0> # /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py <<< 11044 1726853237.97242: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py # code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36864eee10> import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36864edb50> <<< 11044 1726853237.97348: stdout chunk (state=3): >>>import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36866fac60> # /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py <<< 11044 1726853237.97454: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches 
/usr/lib64/python3.12/threading.py <<< 11044 1726853237.97502: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36865171a0> # /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' <<< 11044 1726853237.97576: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py <<< 11044 1726853237.97782: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f368653b560> # /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py # code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' import 'ntpath' # # /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f368659c2c0> <<< 11044 1726853237.97874: stdout chunk (state=3): >>># /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py <<< 11044 1726853237.97913: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' <<< 11044 1726853237.98092: stdout chunk (state=3): >>>import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 
0x7f368659ea20> <<< 11044 1726853237.98159: stdout chunk (state=3): >>>import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f368659c3e0> <<< 11044 1726853237.98288: stdout chunk (state=3): >>>import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f368655d2b0> # /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36863a13d0> <<< 11044 1726853237.98310: stdout chunk (state=3): >>>import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7f368653a360> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36864efd70> <<< 11044 1726853237.98443: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/cp437.pyc' <<< 11044 1726853237.98729: stdout chunk (state=3): >>>import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7f36863a1670> # zipimport: found 30 names in '/tmp/ansible_stat_payload_imbbekoh/ansible_stat_payload.zip' # zipimport: zlib available <<< 11044 1726853237.98891: stdout chunk (state=3): >>># zipimport: zlib available<<< 11044 1726853237.98942: stdout chunk (state=3): >>> # /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py <<< 11044 1726853237.98965: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' <<< 11044 1726853237.99030: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py<<< 11044 1726853237.99055: stdout chunk (state=3): >>> <<< 11044 1726853237.99141: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' <<< 11044 
1726853237.99190: stdout chunk (state=3): >>># /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py # code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' <<< 11044 1726853237.99216: stdout chunk (state=3): >>>import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36863f7170> import '_typing' # <<< 11044 1726853237.99291: stdout chunk (state=3): >>> <<< 11044 1726853237.99546: stdout chunk (state=3): >>>import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36863d6060> <<< 11044 1726853237.99574: stdout chunk (state=3): >>>import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36863d51f0> # zipimport: zlib available <<< 11044 1726853237.99609: stdout chunk (state=3): >>>import 'ansible' # # zipimport: zlib available <<< 11044 1726853237.99632: stdout chunk (state=3): >>># zipimport: zlib available<<< 11044 1726853237.99661: stdout chunk (state=3): >>> # zipimport: zlib available <<< 11044 1726853237.99714: stdout chunk (state=3): >>>import 'ansible.module_utils' # # zipimport: zlib available <<< 11044 1726853238.01300: stdout chunk (state=3): >>># zipimport: zlib available <<< 11044 1726853238.03000: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py # code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36863f5040> <<< 11044 1726853238.03031: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py # code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' <<< 11044 1726853238.03079: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches 
/usr/lib64/python3.12/json/decoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py <<< 11044 1726853238.03084: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' <<< 11044 1726853238.03155: stdout chunk (state=3): >>># extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f368641eab0> import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7f368641e840> <<< 11044 1726853238.03206: stdout chunk (state=3): >>>import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f368641e150> <<< 11044 1726853238.03233: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' <<< 11044 1726853238.03323: stdout chunk (state=3): >>>import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f368641e5a0> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36863f7b90> import 'atexit' # # extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f368641f830> <<< 11044 1726853238.03370: stdout chunk (state=3): >>># extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' # extension module 'fcntl' 
executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f368641fa70> # /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py <<< 11044 1726853238.03424: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' <<< 11044 1726853238.03459: stdout chunk (state=3): >>>import '_locale' # <<< 11044 1726853238.03582: stdout chunk (state=3): >>>import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f368641ffb0> import 'pwd' # # /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py # code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' <<< 11044 1726853238.03623: stdout chunk (state=3): >>>import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3685d15d00> <<< 11044 1726853238.03640: stdout chunk (state=3): >>># extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3685d17920> <<< 11044 1726853238.03678: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py <<< 11044 1726853238.03702: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' <<< 11044 1726853238.03722: stdout chunk (state=3): >>>import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3685d182c0> <<< 11044 1726853238.03788: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py # code object from 
'/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' <<< 11044 1726853238.03802: stdout chunk (state=3): >>>import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3685d19460> <<< 11044 1726853238.03816: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py <<< 11044 1726853238.03936: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py # code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' <<< 11044 1726853238.03965: stdout chunk (state=3): >>>import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3685d1bf20> <<< 11044 1726853238.04082: stdout chunk (state=3): >>># extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3685d205c0> import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3685d1a210> # /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py # code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' <<< 11044 1726853238.04099: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py # code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' <<< 11044 1726853238.04129: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py <<< 11044 1726853238.04169: stdout chunk (state=3): >>># code object 
from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' <<< 11044 1726853238.04187: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py # code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' <<< 11044 1726853238.04266: stdout chunk (state=3): >>>import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3685d23f80> import '_tokenize' # <<< 11044 1726853238.04303: stdout chunk (state=3): >>>import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3685d22a50> <<< 11044 1726853238.04308: stdout chunk (state=3): >>>import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3685d227b0> <<< 11044 1726853238.04362: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py # code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' <<< 11044 1726853238.04464: stdout chunk (state=3): >>>import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3685d22d20> <<< 11044 1726853238.04511: stdout chunk (state=3): >>>import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3685d1a6f0> <<< 11044 1726853238.04515: stdout chunk (state=3): >>># extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3685d6c1a0> <<< 11044 1726853238.04540: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f3685d6c350> <<< 11044 1726853238.04577: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py <<< 11044 1726853238.04580: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' <<< 11044 1726853238.04606: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' <<< 11044 1726853238.04649: stdout chunk (state=3): >>># extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3685d6ddf0> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3685d6dbb0> <<< 11044 1726853238.04664: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py <<< 11044 1726853238.04789: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' <<< 11044 1726853238.04849: stdout chunk (state=3): >>># extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3685d70350> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3685d6e4e0> <<< 11044 1726853238.04874: stdout chunk (state=3): >>># 
/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py <<< 11044 1726853238.04911: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' <<< 11044 1726853238.04943: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py <<< 11044 1726853238.04967: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' import '_string' # <<< 11044 1726853238.04988: stdout chunk (state=3): >>>import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3685d73b00> <<< 11044 1726853238.05103: stdout chunk (state=3): >>>import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3685d704d0> <<< 11044 1726853238.05180: stdout chunk (state=3): >>># extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3685d74b60> <<< 11044 1726853238.05194: stdout chunk (state=3): >>># extension module 'systemd._reader' loaded from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3685d74950> <<< 11044 1726853238.05277: stdout chunk (state=3): >>># extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from 
'/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3685d74bf0> <<< 11044 1726853238.05280: stdout chunk (state=3): >>>import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3685d6c4a0> <<< 11044 1726853238.05283: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' <<< 11044 1726853238.05311: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py # code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' <<< 11044 1726853238.05338: stdout chunk (state=3): >>># extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' <<< 11044 1726853238.05367: stdout chunk (state=3): >>># extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3685dfc4d0> <<< 11044 1726853238.05541: stdout chunk (state=3): >>># extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3685dfd400> <<< 11044 1726853238.05551: stdout chunk (state=3): >>>import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3685d76c60> <<< 11044 1726853238.05607: stdout chunk (state=3): >>># extension module 'systemd._daemon' loaded from 
'/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3685d77fe0> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3685d76870> # zipimport: zlib available <<< 11044 1726853238.05610: stdout chunk (state=3): >>># zipimport: zlib available <<< 11044 1726853238.05629: stdout chunk (state=3): >>>import 'ansible.module_utils.compat' # # zipimport: zlib available <<< 11044 1726853238.05697: stdout chunk (state=3): >>># zipimport: zlib available <<< 11044 1726853238.05824: stdout chunk (state=3): >>># zipimport: zlib available <<< 11044 1726853238.05832: stdout chunk (state=3): >>># zipimport: zlib available <<< 11044 1726853238.05860: stdout chunk (state=3): >>>import 'ansible.module_utils.common' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text' # <<< 11044 1726853238.05870: stdout chunk (state=3): >>># zipimport: zlib available <<< 11044 1726853238.05979: stdout chunk (state=3): >>># zipimport: zlib available <<< 11044 1726853238.06122: stdout chunk (state=3): >>># zipimport: zlib available <<< 11044 1726853238.06639: stdout chunk (state=3): >>># zipimport: zlib available <<< 11044 1726853238.07219: stdout chunk (state=3): >>>import 'ansible.module_utils.six' # import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # <<< 11044 1726853238.07304: stdout chunk (state=3): >>>import 'ansible.module_utils.common.text.converters' # # /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' <<< 11044 1726853238.07340: stdout chunk (state=3): >>># 
extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3685c05820> # /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' <<< 11044 1726853238.07365: stdout chunk (state=3): >>>import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3685c06cc0> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3685dfd580> <<< 11044 1726853238.07476: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.selinux' # # zipimport: zlib available <<< 11044 1726853238.07496: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils._text' # # zipimport: zlib available <<< 11044 1726853238.07606: stdout chunk (state=3): >>># zipimport: zlib available <<< 11044 1726853238.07803: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py # code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3685c06de0> # zipimport: zlib available <<< 11044 1726853238.08232: stdout chunk (state=3): >>># zipimport: zlib available <<< 11044 1726853238.08666: stdout chunk (state=3): >>># zipimport: zlib available <<< 11044 1726853238.08733: stdout chunk (state=3): >>># zipimport: zlib available <<< 11044 1726853238.08808: stdout chunk (state=3): >>>import 'ansible.module_utils.common.collections' # <<< 11044 1726853238.08830: stdout chunk (state=3): >>># zipimport: zlib available <<< 11044 1726853238.08857: stdout chunk (state=3): >>># zipimport: zlib available <<< 
11044 1726853238.08896: stdout chunk (state=3): >>>import 'ansible.module_utils.common.warnings' # <<< 11044 1726853238.08899: stdout chunk (state=3): >>># zipimport: zlib available <<< 11044 1726853238.08968: stdout chunk (state=3): >>># zipimport: zlib available <<< 11044 1726853238.09080: stdout chunk (state=3): >>>import 'ansible.module_utils.errors' # # zipimport: zlib available <<< 11044 1726853238.09098: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.parsing' # # zipimport: zlib available <<< 11044 1726853238.09127: stdout chunk (state=3): >>># zipimport: zlib available <<< 11044 1726853238.09170: stdout chunk (state=3): >>>import 'ansible.module_utils.parsing.convert_bool' # <<< 11044 1726853238.09196: stdout chunk (state=3): >>># zipimport: zlib available <<< 11044 1726853238.09399: stdout chunk (state=3): >>># zipimport: zlib available <<< 11044 1726853238.09714: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py # code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' <<< 11044 1726853238.09851: stdout chunk (state=3): >>>import '_ast' # import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3685c079e0> # zipimport: zlib available <<< 11044 1726853238.09861: stdout chunk (state=3): >>># zipimport: zlib available <<< 11044 1726853238.10207: stdout chunk (state=3): >>>import 'ansible.module_utils.common.text.formatters' # import 'ansible.module_utils.common.validation' # import 'ansible.module_utils.common.parameters' # import 'ansible.module_utils.common.arg_spec' # # zipimport: zlib available <<< 11044 1726853238.10216: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.common.locale' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available <<< 11044 1726853238.10233: stdout chunk (state=3): >>># 
/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py <<< 11044 1726853238.10265: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' <<< 11044 1726853238.10350: stdout chunk (state=3): >>># extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3685c123f0> <<< 11044 1726853238.10383: stdout chunk (state=3): >>>import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3685c0dbe0> <<< 11044 1726853238.10420: stdout chunk (state=3): >>>import 'ansible.module_utils.common.file' # import 'ansible.module_utils.common.process' # <<< 11044 1726853238.10448: stdout chunk (state=3): >>># zipimport: zlib available <<< 11044 1726853238.10487: stdout chunk (state=3): >>># zipimport: zlib available <<< 11044 1726853238.10557: stdout chunk (state=3): >>># zipimport: zlib available <<< 11044 1726853238.10579: stdout chunk (state=3): >>># zipimport: zlib available <<< 11044 1726853238.10625: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' <<< 11044 1726853238.10688: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/distro.py <<< 11044 1726853238.10691: stdout chunk (state=3): >>># code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' 
<<< 11044 1726853238.10693: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py <<< 11044 1726853238.10741: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' <<< 11044 1726853238.10778: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py # code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' <<< 11044 1726853238.10829: stdout chunk (state=3): >>>import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3685d02b10> <<< 11044 1726853238.10862: stdout chunk (state=3): >>>import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36864567e0> <<< 11044 1726853238.10961: stdout chunk (state=3): >>>import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3685c08c50> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3685c0e600> # destroy ansible.module_utils.distro import 'ansible.module_utils.distro' # <<< 11044 1726853238.10964: stdout chunk (state=3): >>># zipimport: zlib available <<< 11044 1726853238.11021: stdout chunk (state=3): >>># zipimport: zlib available <<< 11044 1726853238.11100: stdout chunk (state=3): >>>import 'ansible.module_utils.common._utils' # import 'ansible.module_utils.common.sys_info' # <<< 11044 1726853238.11123: stdout chunk (state=3): >>>import 'ansible.module_utils.basic' # # zipimport: zlib available # zipimport: zlib available import 'ansible.modules' # <<< 11044 1726853238.11139: stdout chunk (state=3): >>># zipimport: zlib available <<< 11044 1726853238.11236: stdout chunk (state=3): >>># zipimport: zlib available <<< 11044 1726853238.11437: stdout chunk (state=3): >>># zipimport: zlib available <<< 11044 1726853238.11590: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": false}, "invocation": 
{"module_args": {"path": "/run/ostree-booted", "follow": false, "get_checksum": true, "get_mime": true, "get_attributes": true, "checksum_algorithm": "sha1"}}} # destroy __main__ <<< 11044 1726853238.11919: stdout chunk (state=3): >>># clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc <<< 11044 1726853238.11974: stdout chunk (state=3): >>># cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing 
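The stdout chunk above carries the stat module's entire reply as one JSON object: the managed node checked `/run/ostree-booted` and reported `"exists": false`, i.e. the host is not an ostree-based system. A minimal sketch of how a controller-side consumer could parse that reply (the payload is copied verbatim from the log; the variable names are mine):

```python
import json

# JSON reply emitted by the stat module run, as captured in the log above.
raw = ('{"changed": false, "stat": {"exists": false}, "invocation": '
       '{"module_args": {"path": "/run/ostree-booted", "follow": false, '
       '"get_checksum": true, "get_mime": true, "get_attributes": true, '
       '"checksum_algorithm": "sha1"}}}')

reply = json.loads(raw)

# /run/ostree-booted exists only on ostree-based hosts (e.g. Fedora CoreOS),
# so "exists": false means this managed node is a conventional install.
is_ostree = reply["stat"]["exists"]
print(is_ostree)                                    # False
print(reply["invocation"]["module_args"]["path"])   # /run/ostree-booted
```

The `invocation.module_args` echo is how Ansible records the exact parameters the module actually ran with, which is what makes these raw replies debuggable from the log alone.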
_collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re <<< 11044 1726853238.12045: stdout chunk (state=3): >>># cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # destroy base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] 
removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil <<< 11044 1726853238.12048: stdout chunk (state=3): >>># cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime <<< 11044 1726853238.12051: stdout chunk (state=3): >>># cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # destroy socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy 
ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # 
destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules <<< 11044 1726853238.12284: stdout chunk (state=3): >>># destroy _sitebuiltins <<< 11044 1726853238.12313: stdout chunk (state=3): >>># destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma # destroy _blake2 # destroy binascii # destroy struct # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path <<< 11044 1726853238.12361: stdout chunk (state=3): >>># destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy fnmatch # destroy ipaddress # destroy ntpath # destroy importlib <<< 11044 1726853238.12473: stdout chunk (state=3): >>># destroy zipimport # destroy __main__ # destroy tempfile # destroy systemd.journal # destroy systemd.daemon # destroy ansible.module_utils.compat.selinux # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy pwd # destroy locale # destroy signal # destroy fcntl # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # destroy selectors # destroy errno # destroy array # destroy datetime # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy json # destroy 
logging # destroy shlex # destroy subprocess <<< 11044 1726853238.12594: stdout chunk (state=3): >>># cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser <<< 11044 1726853238.12727: stdout chunk (state=3): >>># cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # 
cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime <<< 11044 1726853238.12797: stdout chunk (state=3): >>># destroy sys.monitoring # destroy _socket # destroy _collections <<< 11044 1726853238.12872: stdout chunk (state=3): >>># destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response <<< 11044 1726853238.13044: stdout chunk (state=3): >>># destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib <<< 11044 1726853238.13079: stdout chunk (state=3): >>># destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # 
destroy time # destroy _random # destroy _weakref # destroy _hashlib # destroy _operator # destroy _string # destroy re # destroy itertools <<< 11044 1726853238.13109: stdout chunk (state=3): >>># destroy _abc # destroy _sre # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks <<< 11044 1726853238.13441: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.45.153 closed. <<< 11044 1726853238.13503: stderr chunk (state=3): >>><<< 11044 1726853238.13525: stdout chunk (state=3): >>><<< 11044 1726853238.13658: _low_level_execute_command() done: rc=0, stdout=import _frozen_importlib # frozen import _imp # builtin import '_thread' # import '_warnings' # import '_weakref' # import '_io' # import 'marshal' # import 'posix' # import '_frozen_importlib_external' # # installing zipimport hook import 'time' # import 'zipimport' # # installed zipimport hook # /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' import '_codecs' # import 'codecs' # # /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36868184d0> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36867e7b30> # /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7f368681aa50> import '_signal' # import '_abc' # import 'abc' # import 'io' # import '_stat' # import 'stat' # 
import '_collections_abc' # import 'genericpath' # import 'posixpath' # import 'os' # import '_sitebuiltins' # Processing user site-packages Processing global site-packages Adding directory: '/usr/lib64/python3.12/site-packages' Adding directory: '/usr/lib/python3.12/site-packages' Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' # /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8_sig.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7f368662d130> # /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py # code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7f368662dfa0> import 'site' # Python 3.12.5 (main, Aug 23 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. 
# /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py # code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py # code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py # code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py # code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7f368666bec0> # /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py # code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7f368666bf80> # /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py # code object from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py # code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' import 'itertools' # # /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py # code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36866a3830> # /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py # code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' import 'reprlib' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f36866a3ec0> import '_collections' # import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3686683b60> import '_functools' # import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36866812b0> import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3686669070> # /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py # code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' import '_sre' # # /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py # code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py # code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36866c37d0> import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36866c23f0> # /usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py # code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3686682150> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36866c0bc0> # /usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py # code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36866f8890> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36866682f0> # /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches 
/usr/lib64/python3.12/struct.py # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' # extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f36866f8d40> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36866f8bf0> # extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f36866f8fe0> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3686666e10> # /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py # code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36866f9670> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36866f9370> import 'importlib.machinery' # # /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36866fa540> import 'importlib.util' # import 'runpy' # # /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches 
/usr/lib64/python3.12/shutil.py # code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py # code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3686710740> import 'errno' # # extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3686711e20> # /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py # code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py # code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3686712cc0> # extension module '_bz2' loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f36867132f0> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3686712210> # /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py # code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' # extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7f3686713d70> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36867134a0> import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36866fa4b0> # /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py # code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py # code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' # extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f36864c3c50> # /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py # code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' # extension module '_bisect' loaded from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f36864ec710> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36864ec470> # extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f36864ec740> # /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py # code object from 
'/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' # extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' # extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f36864ed070> # extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f36864eda60> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36864ec920> import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36864c1df0> # /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py # code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py # code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36864eee10> import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36864edb50> import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36866fac60> # /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py # code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py # code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' import 'threading' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f36865171a0> # /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py # code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f368653b560> # /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py # code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' import 'ntpath' # # /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f368659c2c0> # /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py # code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7f368659ea20> import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f368659c3e0> import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f368655d2b0> # /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f36863a13d0> import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7f368653a360> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36864efd70> # code object from '/usr/lib64/python3.12/encodings/cp437.pyc' import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7f36863a1670> # zipimport: found 30 names in '/tmp/ansible_stat_payload_imbbekoh/ansible_stat_payload.zip' # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py # code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py # code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py # code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36863f7170> import '_typing' # import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36863d6060> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36863d51f0> # zipimport: zlib available import 'ansible' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py # code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36863f5040> # /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc 
matches /usr/lib64/python3.12/json/__init__.py # code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' # extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f368641eab0> import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7f368641e840> import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f368641e150> # /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f368641e5a0> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36863f7b90> import 'atexit' # # extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f368641f830> # extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7f368641fa70> # /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py # code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' import '_locale' # import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f368641ffb0> import 'pwd' # # /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py # code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3685d15d00> # extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3685d17920> # /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py # code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3685d182c0> # /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py # code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3685d19460> # /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py # code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py # code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3685d1bf20> # extension module 
'_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3685d205c0> import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3685d1a210> # /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py # code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py # code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py # code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py # code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3685d23f80> import '_tokenize' # import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3685d22a50> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3685d227b0> # /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py # code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3685d22d20> import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3685d1a6f0> # extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' # extension module 'syslog' executed from 
'/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3685d6c1a0> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3685d6c350> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' # extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3685d6ddf0> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3685d6dbb0> # /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py # code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' # extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3685d70350> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3685d6e4e0> # 
/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py # code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py # code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' import '_string' # import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3685d73b00> import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3685d704d0> # extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3685d74b60> # extension module 'systemd._reader' loaded from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3685d74950> # extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3685d74bf0> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3685d6c4a0> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from 
'/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py # code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' # extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' # extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3685dfc4d0> # extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3685dfd400> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3685d76c60> # extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3685d77fe0> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3685d76870> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.six' # import 'ansible.module_utils.six.moves' 
# import 'ansible.module_utils.six.moves.collections_abc' # import 'ansible.module_utils.common.text.converters' # # /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' # extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3685c05820> # /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3685c06cc0> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3685dfd580> import 'ansible.module_utils.compat.selinux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils._text' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py # code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3685c06de0> # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.collections' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.warnings' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.errors' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing' # # zipimport: zlib available # zipimport: zlib 
available import 'ansible.module_utils.parsing.convert_bool' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py # code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' import '_ast' # import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3685c079e0> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text.formatters' # import 'ansible.module_utils.common.validation' # import 'ansible.module_utils.common.parameters' # import 'ansible.module_utils.common.arg_spec' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.locale' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py # code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' # extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3685c123f0> import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3685c0dbe0> import 'ansible.module_utils.common.file' # import 'ansible.module_utils.common.process' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py # code object from 
'/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' # /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/distro.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py # code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py # code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3685d02b10> import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36864567e0> import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3685c08c50> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3685c0e600> # destroy ansible.module_utils.distro import 'ansible.module_utils.distro' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common._utils' # import 'ansible.module_utils.common.sys_info' # import 'ansible.module_utils.basic' # # zipimport: zlib available # zipimport: zlib available import 'ansible.modules' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"path": "/run/ostree-booted", "follow": false, "get_checksum": true, "get_mime": true, "get_attributes": true, "checksum_algorithm": "sha1"}}} # destroy __main__ # clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear 
sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # destroy base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing 
importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform 
# cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # destroy socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy 
ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # destroy _sitebuiltins # destroy 
importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma # destroy _blake2 # destroy binascii # destroy struct # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy fnmatch # destroy ipaddress # destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy tempfile # destroy systemd.journal # destroy systemd.daemon # destroy ansible.module_utils.compat.selinux # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy pwd # destroy locale # destroy signal # destroy fcntl # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # destroy selectors # destroy errno # destroy array # destroy datetime # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy json # destroy logging # destroy shlex # destroy subprocess # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # 
cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy sys.monitoring # destroy _socket # destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy 
ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib # destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref # destroy _hashlib # destroy _operator # destroy _string # destroy re # destroy itertools # destroy _abc # destroy _sre # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: 
fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 Shared connection to 10.31.45.153 closed. [WARNING]: Module invocation had junk after the JSON data: # destroy __main__ # clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] 
removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # destroy base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] 
removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # destroy socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing 
ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy 
ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # destroy _sitebuiltins # destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma # destroy _blake2 # destroy binascii # destroy struct # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy fnmatch # destroy ipaddress # destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy tempfile # destroy systemd.journal # destroy systemd.daemon # destroy ansible.module_utils.compat.selinux # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy pwd # destroy locale # destroy signal # destroy fcntl # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # destroy selectors # destroy errno # destroy array # destroy datetime # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy json # destroy logging # destroy shlex # destroy subprocess # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # 
cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping 
sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy sys.monitoring # destroy _socket # destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib # destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref # destroy _hashlib # destroy _operator # destroy _string # destroy re # destroy itertools # destroy _abc # destroy _sre # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks 11044 1726853238.14370: done with _execute_module (stat, {'path': '/run/ostree-booted', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], 
'_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853237.7147403-11186-277636433844309/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11044 1726853238.14378: _low_level_execute_command(): starting 11044 1726853238.14381: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853237.7147403-11186-277636433844309/ > /dev/null 2>&1 && sleep 0' 11044 1726853238.14701: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11044 1726853238.14716: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11044 1726853238.14776: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11044 1726853238.14790: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11044 1726853238.14847: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 11044 1726853238.14873: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11044 1726853238.14887: stderr chunk 
(state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11044 1726853238.14960: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11044 1726853238.16864: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11044 1726853238.16868: stdout chunk (state=3): >>><<< 11044 1726853238.16874: stderr chunk (state=3): >>><<< 11044 1726853238.16894: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11044 1726853238.17077: handler run complete 11044 1726853238.17080: attempt loop complete, returning result 11044 1726853238.17082: _execute() done 11044 1726853238.17084: dumping result to json 11044 1726853238.17086: done dumping result, returning 11044 1726853238.17088: done running TaskExecutor() for managed_node1/TASK: Check if system is ostree 
[02083763-bbaf-c5a6-f857-0000000000e0] 11044 1726853238.17090: sending task result for task 02083763-bbaf-c5a6-f857-0000000000e0 11044 1726853238.17158: done sending task result for task 02083763-bbaf-c5a6-f857-0000000000e0 11044 1726853238.17161: WORKER PROCESS EXITING ok: [managed_node1] => { "changed": false, "stat": { "exists": false } } 11044 1726853238.17229: no more pending results, returning what we have 11044 1726853238.17232: results queue empty 11044 1726853238.17233: checking for any_errors_fatal 11044 1726853238.17241: done checking for any_errors_fatal 11044 1726853238.17241: checking for max_fail_percentage 11044 1726853238.17243: done checking for max_fail_percentage 11044 1726853238.17246: checking to see if all hosts have failed and the running result is not ok 11044 1726853238.17247: done checking to see if all hosts have failed 11044 1726853238.17248: getting the remaining hosts for this loop 11044 1726853238.17249: done getting the remaining hosts for this loop 11044 1726853238.17253: getting the next task for host managed_node1 11044 1726853238.17259: done getting next task for host managed_node1 11044 1726853238.17261: ^ task is: TASK: Set flag to indicate system is ostree 11044 1726853238.17264: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11044 1726853238.17268: getting variables 11044 1726853238.17269: in VariableManager get_vars() 11044 1726853238.17302: Calling all_inventory to load vars for managed_node1 11044 1726853238.17305: Calling groups_inventory to load vars for managed_node1 11044 1726853238.17308: Calling all_plugins_inventory to load vars for managed_node1 11044 1726853238.17319: Calling all_plugins_play to load vars for managed_node1 11044 1726853238.17322: Calling groups_plugins_inventory to load vars for managed_node1 11044 1726853238.17325: Calling groups_plugins_play to load vars for managed_node1 11044 1726853238.17732: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11044 1726853238.18038: done with get_vars() 11044 1726853238.18051: done getting variables 11044 1726853238.18158: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [Set flag to indicate system is ostree] *********************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:22 Friday 20 September 2024 13:27:18 -0400 (0:00:00.525) 0:00:02.557 ****** 11044 1726853238.18189: entering _queue_task() for managed_node1/set_fact 11044 1726853238.18191: Creating lock for set_fact 11044 1726853238.18555: worker is 1 (out of 1 available) 11044 1726853238.18567: exiting _queue_task() for managed_node1/set_fact 11044 1726853238.18581: done queuing things up, now waiting for results queue to drain 11044 1726853238.18583: waiting for pending results... 
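The trace above covers two tasks from `el_repo_setup.yml`: a `stat` of `/run/ostree-booted` (which returned `exists: false`) and a follow-up `set_fact` that records the result as `__network_is_ostree`. A hedged reconstruction of what those tasks likely look like — the task names, the registered variable `__ostree_booted_stat`, the fact name, and the `not __network_is_ostree is defined` conditional all come from this log, but the exact YAML is an assumption, not the actual file contents:

```yaml
# Hypothetical sketch of the two traced tasks; names and the
# conditional match the log output, the YAML itself is assumed.
- name: Check if system is ostree
  ansible.builtin.stat:
    path: /run/ostree-booted
  register: __ostree_booted_stat
  when: not __network_is_ostree is defined

- name: Set flag to indicate system is ostree
  ansible.builtin.set_fact:
    __network_is_ostree: "{{ __ostree_booted_stat.stat.exists }}"
  when: not __network_is_ostree is defined
```

This pattern (stat a marker file, cache the boolean as a fact) avoids re-probing the host on later runs, which matches the `Evaluated conditional (not __network_is_ostree is defined): True` line in the trace.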
11044 1726853238.18734: running TaskExecutor() for managed_node1/TASK: Set flag to indicate system is ostree 11044 1726853238.18808: in run() - task 02083763-bbaf-c5a6-f857-0000000000e1 11044 1726853238.18816: variable 'ansible_search_path' from source: unknown 11044 1726853238.18820: variable 'ansible_search_path' from source: unknown 11044 1726853238.18851: calling self._execute() 11044 1726853238.18908: variable 'ansible_host' from source: host vars for 'managed_node1' 11044 1726853238.18914: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11044 1726853238.18922: variable 'omit' from source: magic vars 11044 1726853238.19265: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 11044 1726853238.19491: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 11044 1726853238.19520: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 11044 1726853238.19549: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 11044 1726853238.19578: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 11044 1726853238.19639: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 11044 1726853238.19660: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 11044 1726853238.19682: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 11044 1726853238.19699: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 11044 1726853238.19788: Evaluated conditional (not __network_is_ostree is defined): True 11044 1726853238.19793: variable 'omit' from source: magic vars 11044 1726853238.19820: variable 'omit' from source: magic vars 11044 1726853238.19905: variable '__ostree_booted_stat' from source: set_fact 11044 1726853238.19945: variable 'omit' from source: magic vars 11044 1726853238.19966: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11044 1726853238.19990: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11044 1726853238.20006: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11044 1726853238.20018: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11044 1726853238.20027: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11044 1726853238.20052: variable 'inventory_hostname' from source: host vars for 'managed_node1' 11044 1726853238.20055: variable 'ansible_host' from source: host vars for 'managed_node1' 11044 1726853238.20058: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11044 1726853238.20152: Set connection var ansible_timeout to 10 11044 1726853238.20162: Set connection var ansible_shell_executable to /bin/sh 11044 1726853238.20166: Set connection var ansible_shell_type to sh 11044 1726853238.20179: Set connection var ansible_module_compression to ZIP_DEFLATED 11044 1726853238.20192: Set connection var ansible_connection to ssh 11044 1726853238.20196: Set connection var ansible_pipelining to False 11044 1726853238.20250: variable 'ansible_shell_executable' 
from source: unknown 11044 1726853238.20254: variable 'ansible_connection' from source: unknown 11044 1726853238.20257: variable 'ansible_module_compression' from source: unknown 11044 1726853238.20259: variable 'ansible_shell_type' from source: unknown 11044 1726853238.20261: variable 'ansible_shell_executable' from source: unknown 11044 1726853238.20263: variable 'ansible_host' from source: host vars for 'managed_node1' 11044 1726853238.20265: variable 'ansible_pipelining' from source: unknown 11044 1726853238.20268: variable 'ansible_timeout' from source: unknown 11044 1726853238.20272: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11044 1726853238.20476: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11044 1726853238.20480: variable 'omit' from source: magic vars 11044 1726853238.20482: starting attempt loop 11044 1726853238.20484: running the handler 11044 1726853238.20487: handler run complete 11044 1726853238.20489: attempt loop complete, returning result 11044 1726853238.20491: _execute() done 11044 1726853238.20494: dumping result to json 11044 1726853238.20496: done dumping result, returning 11044 1726853238.20498: done running TaskExecutor() for managed_node1/TASK: Set flag to indicate system is ostree [02083763-bbaf-c5a6-f857-0000000000e1] 11044 1726853238.20500: sending task result for task 02083763-bbaf-c5a6-f857-0000000000e1 11044 1726853238.20565: done sending task result for task 02083763-bbaf-c5a6-f857-0000000000e1 11044 1726853238.20569: WORKER PROCESS EXITING ok: [managed_node1] => { "ansible_facts": { "__network_is_ostree": false }, "changed": false } 11044 1726853238.20633: no more pending results, returning what we have 11044 1726853238.20635: results 
queue empty 11044 1726853238.20636: checking for any_errors_fatal 11044 1726853238.20643: done checking for any_errors_fatal 11044 1726853238.20643: checking for max_fail_percentage 11044 1726853238.20645: done checking for max_fail_percentage 11044 1726853238.20646: checking to see if all hosts have failed and the running result is not ok 11044 1726853238.20646: done checking to see if all hosts have failed 11044 1726853238.20647: getting the remaining hosts for this loop 11044 1726853238.20648: done getting the remaining hosts for this loop 11044 1726853238.20651: getting the next task for host managed_node1 11044 1726853238.20659: done getting next task for host managed_node1 11044 1726853238.20662: ^ task is: TASK: Fix CentOS6 Base repo 11044 1726853238.20665: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11044 1726853238.20668: getting variables 11044 1726853238.20670: in VariableManager get_vars() 11044 1726853238.20708: Calling all_inventory to load vars for managed_node1 11044 1726853238.20711: Calling groups_inventory to load vars for managed_node1 11044 1726853238.20714: Calling all_plugins_inventory to load vars for managed_node1 11044 1726853238.20724: Calling all_plugins_play to load vars for managed_node1 11044 1726853238.20726: Calling groups_plugins_inventory to load vars for managed_node1 11044 1726853238.20734: Calling groups_plugins_play to load vars for managed_node1 11044 1726853238.20979: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11044 1726853238.21140: done with get_vars() 11044 1726853238.21147: done getting variables 11044 1726853238.21260: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [Fix CentOS6 Base repo] *************************************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:26 Friday 20 September 2024 13:27:18 -0400 (0:00:00.030) 0:00:02.588 ****** 11044 1726853238.21288: entering _queue_task() for managed_node1/copy 11044 1726853238.21532: worker is 1 (out of 1 available) 11044 1726853238.21543: exiting _queue_task() for managed_node1/copy 11044 1726853238.21555: done queuing things up, now waiting for results queue to drain 11044 1726853238.21556: waiting for pending results... 
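The task queued here, `Fix CentOS6 Base repo` at `el_repo_setup.yml:26`, is a `copy` action gated on distribution facts; the trace that follows shows it evaluating `ansible_distribution == 'CentOS'` as True and `ansible_distribution_major_version == '6'` as False, so it is skipped. A hedged sketch of such a task — the name, action, and both conditionals are taken from the log, while the destination path and file contents are pure assumptions for illustration:

```yaml
# Hypothetical sketch; only the task name, module, and `when`
# conditions are grounded in the log. Destination and content
# are assumed (EOL CentOS 6 mirrors moved to the vault archive).
- name: Fix CentOS6 Base repo
  ansible.builtin.copy:
    dest: /etc/yum.repos.d/CentOS-Base.repo  # assumed path
    content: |
      # assumed: repo definitions pointing at vault.centos.org
  when:
    - ansible_distribution == 'CentOS'
    - ansible_distribution_major_version == '6'
```

On the CentOS version under test here the second condition is False, which is why the log reports `skip_reason: Conditional result was False` rather than running the copy.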
11044 1726853238.21798: running TaskExecutor() for managed_node1/TASK: Fix CentOS6 Base repo 11044 1726853238.21893: in run() - task 02083763-bbaf-c5a6-f857-0000000000e3 11044 1726853238.21979: variable 'ansible_search_path' from source: unknown 11044 1726853238.21982: variable 'ansible_search_path' from source: unknown 11044 1726853238.21984: calling self._execute() 11044 1726853238.22031: variable 'ansible_host' from source: host vars for 'managed_node1' 11044 1726853238.22042: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11044 1726853238.22056: variable 'omit' from source: magic vars 11044 1726853238.22481: variable 'ansible_distribution' from source: facts 11044 1726853238.22498: Evaluated conditional (ansible_distribution == 'CentOS'): True 11044 1726853238.22580: variable 'ansible_distribution_major_version' from source: facts 11044 1726853238.22586: Evaluated conditional (ansible_distribution_major_version == '6'): False 11044 1726853238.22589: when evaluation is False, skipping this task 11044 1726853238.22591: _execute() done 11044 1726853238.22594: dumping result to json 11044 1726853238.22599: done dumping result, returning 11044 1726853238.22605: done running TaskExecutor() for managed_node1/TASK: Fix CentOS6 Base repo [02083763-bbaf-c5a6-f857-0000000000e3] 11044 1726853238.22609: sending task result for task 02083763-bbaf-c5a6-f857-0000000000e3 11044 1726853238.22706: done sending task result for task 02083763-bbaf-c5a6-f857-0000000000e3 11044 1726853238.22710: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version == '6'", "skip_reason": "Conditional result was False" } 11044 1726853238.22781: no more pending results, returning what we have 11044 1726853238.22784: results queue empty 11044 1726853238.22785: checking for any_errors_fatal 11044 1726853238.22788: done checking for any_errors_fatal 11044 1726853238.22789: checking for 
max_fail_percentage 11044 1726853238.22790: done checking for max_fail_percentage 11044 1726853238.22791: checking to see if all hosts have failed and the running result is not ok 11044 1726853238.22792: done checking to see if all hosts have failed 11044 1726853238.22793: getting the remaining hosts for this loop 11044 1726853238.22794: done getting the remaining hosts for this loop 11044 1726853238.22797: getting the next task for host managed_node1 11044 1726853238.22802: done getting next task for host managed_node1 11044 1726853238.22804: ^ task is: TASK: Include the task 'enable_epel.yml' 11044 1726853238.22806: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11044 1726853238.22809: getting variables 11044 1726853238.22810: in VariableManager get_vars() 11044 1726853238.22837: Calling all_inventory to load vars for managed_node1 11044 1726853238.22839: Calling groups_inventory to load vars for managed_node1 11044 1726853238.22842: Calling all_plugins_inventory to load vars for managed_node1 11044 1726853238.22853: Calling all_plugins_play to load vars for managed_node1 11044 1726853238.22855: Calling groups_plugins_inventory to load vars for managed_node1 11044 1726853238.22864: Calling groups_plugins_play to load vars for managed_node1 11044 1726853238.22979: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11044 1726853238.23093: done with get_vars() 11044 1726853238.23100: done getting variables TASK [Include the task 'enable_epel.yml'] ************************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:51 Friday 20 September 2024 13:27:18 -0400 (0:00:00.018) 0:00:02.607 ****** 11044 1726853238.23161: entering _queue_task() for managed_node1/include_tasks 11044 1726853238.23356: worker is 1 (out of 1 available) 11044 1726853238.23366: exiting _queue_task() for managed_node1/include_tasks 11044 1726853238.23380: done queuing things up, now waiting for results queue to drain 11044 1726853238.23381: waiting for pending results... 
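The "Fix CentOS6 Base repo" task above was skipped because only the first half of its guard held: `ansible_distribution == 'CentOS'` evaluated True, but `ansible_distribution_major_version == '6'` evaluated False. A hypothetical reconstruction of that task, based only on what the log shows (a `copy` action at `el_repo_setup.yml:26` plus the two evaluated conditionals) — the destination path and file content below are illustrative assumptions, not taken from the log:

```yaml
# Hypothetical sketch of tests/network/tasks/el_repo_setup.yml:26.
# Only the action plugin (copy) and the two conditionals are visible
# in the -vvvv output; dest and content are illustrative placeholders.
- name: Fix CentOS6 Base repo
  copy:
    dest: /etc/yum.repos.d/CentOS-Base.repo   # assumed, not in the log
    content: "..."                            # not recoverable from the log
  when:
    - ansible_distribution == 'CentOS'
    - ansible_distribution_major_version == '6'
```

On this run the second condition was False, so the task reported `skip_reason: "Conditional result was False"` with `changed: false`, and the worker exited without ever invoking the copy module.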
11044 1726853238.23521: running TaskExecutor() for managed_node1/TASK: Include the task 'enable_epel.yml' 11044 1726853238.23577: in run() - task 02083763-bbaf-c5a6-f857-0000000000e4 11044 1726853238.23588: variable 'ansible_search_path' from source: unknown 11044 1726853238.23592: variable 'ansible_search_path' from source: unknown 11044 1726853238.23620: calling self._execute() 11044 1726853238.23675: variable 'ansible_host' from source: host vars for 'managed_node1' 11044 1726853238.23718: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11044 1726853238.23721: variable 'omit' from source: magic vars 11044 1726853238.24081: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 11044 1726853238.25857: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 11044 1726853238.25913: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 11044 1726853238.25938: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 11044 1726853238.25964: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 11044 1726853238.25984: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 11044 1726853238.26047: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11044 1726853238.26065: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11044 1726853238.26084: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11044 1726853238.26109: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11044 1726853238.26120: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11044 1726853238.26208: variable '__network_is_ostree' from source: set_fact 11044 1726853238.26222: Evaluated conditional (not __network_is_ostree | d(false)): True 11044 1726853238.26228: _execute() done 11044 1726853238.26233: dumping result to json 11044 1726853238.26235: done dumping result, returning 11044 1726853238.26251: done running TaskExecutor() for managed_node1/TASK: Include the task 'enable_epel.yml' [02083763-bbaf-c5a6-f857-0000000000e4] 11044 1726853238.26254: sending task result for task 02083763-bbaf-c5a6-f857-0000000000e4 11044 1726853238.26331: done sending task result for task 02083763-bbaf-c5a6-f857-0000000000e4 11044 1726853238.26333: WORKER PROCESS EXITING 11044 1726853238.26375: no more pending results, returning what we have 11044 1726853238.26380: in VariableManager get_vars() 11044 1726853238.26413: Calling all_inventory to load vars for managed_node1 11044 1726853238.26416: Calling groups_inventory to load vars for managed_node1 11044 1726853238.26419: Calling all_plugins_inventory to load vars for managed_node1 11044 1726853238.26430: Calling all_plugins_play to load vars for managed_node1 11044 1726853238.26432: Calling groups_plugins_inventory to load vars for managed_node1 11044 1726853238.26435: Calling groups_plugins_play to load vars for managed_node1 11044 1726853238.26628: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved 
name 11044 1726853238.26742: done with get_vars() 11044 1726853238.26748: variable 'ansible_search_path' from source: unknown 11044 1726853238.26749: variable 'ansible_search_path' from source: unknown 11044 1726853238.26776: we have included files to process 11044 1726853238.26777: generating all_blocks data 11044 1726853238.26778: done generating all_blocks data 11044 1726853238.26783: processing included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml 11044 1726853238.26784: loading included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml 11044 1726853238.26785: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml 11044 1726853238.27216: done processing included file 11044 1726853238.27217: iterating over new_blocks loaded from include file 11044 1726853238.27218: in VariableManager get_vars() 11044 1726853238.27226: done with get_vars() 11044 1726853238.27227: filtering new block on tags 11044 1726853238.27244: done filtering new block on tags 11044 1726853238.27246: in VariableManager get_vars() 11044 1726853238.27253: done with get_vars() 11044 1726853238.27254: filtering new block on tags 11044 1726853238.27261: done filtering new block on tags 11044 1726853238.27262: done iterating over new_blocks loaded from include file included: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml for managed_node1 11044 1726853238.27266: extending task lists for all hosts with included blocks 11044 1726853238.27324: done extending task lists 11044 1726853238.27324: done processing included files 11044 1726853238.27325: results queue empty 11044 1726853238.27325: checking for any_errors_fatal 11044 1726853238.27327: done checking for any_errors_fatal 11044 1726853238.27328: checking for max_fail_percentage 11044 1726853238.27328: done 
checking for max_fail_percentage 11044 1726853238.27329: checking to see if all hosts have failed and the running result is not ok 11044 1726853238.27329: done checking to see if all hosts have failed 11044 1726853238.27330: getting the remaining hosts for this loop 11044 1726853238.27331: done getting the remaining hosts for this loop 11044 1726853238.27332: getting the next task for host managed_node1 11044 1726853238.27335: done getting next task for host managed_node1 11044 1726853238.27336: ^ task is: TASK: Create EPEL {{ ansible_distribution_major_version }} 11044 1726853238.27338: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11044 1726853238.27339: getting variables 11044 1726853238.27339: in VariableManager get_vars() 11044 1726853238.27345: Calling all_inventory to load vars for managed_node1 11044 1726853238.27347: Calling groups_inventory to load vars for managed_node1 11044 1726853238.27349: Calling all_plugins_inventory to load vars for managed_node1 11044 1726853238.27354: Calling all_plugins_play to load vars for managed_node1 11044 1726853238.27360: Calling groups_plugins_inventory to load vars for managed_node1 11044 1726853238.27362: Calling groups_plugins_play to load vars for managed_node1 11044 1726853238.27457: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11044 1726853238.27569: done with get_vars() 11044 1726853238.27578: done getting variables 11044 1726853238.27622: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) 11044 1726853238.27760: variable 'ansible_distribution_major_version' from source: facts TASK [Create EPEL 10] ********************************************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:8 Friday 20 September 2024 13:27:18 -0400 (0:00:00.046) 0:00:02.653 ****** 11044 1726853238.27797: entering _queue_task() for managed_node1/command 11044 1726853238.27798: Creating lock for command 11044 1726853238.28023: worker is 1 (out of 1 available) 11044 1726853238.28033: exiting _queue_task() for managed_node1/command 11044 1726853238.28044: done queuing things up, now waiting for results queue to drain 11044 1726853238.28045: waiting for pending results... 
11044 1726853238.28202: running TaskExecutor() for managed_node1/TASK: Create EPEL 10 11044 1726853238.28277: in run() - task 02083763-bbaf-c5a6-f857-0000000000fe 11044 1726853238.28287: variable 'ansible_search_path' from source: unknown 11044 1726853238.28291: variable 'ansible_search_path' from source: unknown 11044 1726853238.28315: calling self._execute() 11044 1726853238.28373: variable 'ansible_host' from source: host vars for 'managed_node1' 11044 1726853238.28382: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11044 1726853238.28391: variable 'omit' from source: magic vars 11044 1726853238.28657: variable 'ansible_distribution' from source: facts 11044 1726853238.28665: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): True 11044 1726853238.28754: variable 'ansible_distribution_major_version' from source: facts 11044 1726853238.28757: Evaluated conditional (ansible_distribution_major_version in ['7', '8']): False 11044 1726853238.28761: when evaluation is False, skipping this task 11044 1726853238.28763: _execute() done 11044 1726853238.28767: dumping result to json 11044 1726853238.28772: done dumping result, returning 11044 1726853238.28779: done running TaskExecutor() for managed_node1/TASK: Create EPEL 10 [02083763-bbaf-c5a6-f857-0000000000fe] 11044 1726853238.28781: sending task result for task 02083763-bbaf-c5a6-f857-0000000000fe 11044 1726853238.28876: done sending task result for task 02083763-bbaf-c5a6-f857-0000000000fe 11044 1726853238.28879: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version in ['7', '8']", "skip_reason": "Conditional result was False" } 11044 1726853238.28926: no more pending results, returning what we have 11044 1726853238.28929: results queue empty 11044 1726853238.28930: checking for any_errors_fatal 11044 1726853238.28931: done checking for any_errors_fatal 11044 1726853238.28932: checking for 
max_fail_percentage 11044 1726853238.28933: done checking for max_fail_percentage 11044 1726853238.28934: checking to see if all hosts have failed and the running result is not ok 11044 1726853238.28935: done checking to see if all hosts have failed 11044 1726853238.28936: getting the remaining hosts for this loop 11044 1726853238.28937: done getting the remaining hosts for this loop 11044 1726853238.28940: getting the next task for host managed_node1 11044 1726853238.28945: done getting next task for host managed_node1 11044 1726853238.28947: ^ task is: TASK: Install yum-utils package 11044 1726853238.28951: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11044 1726853238.28953: getting variables 11044 1726853238.28955: in VariableManager get_vars() 11044 1726853238.28979: Calling all_inventory to load vars for managed_node1 11044 1726853238.28982: Calling groups_inventory to load vars for managed_node1 11044 1726853238.28984: Calling all_plugins_inventory to load vars for managed_node1 11044 1726853238.28993: Calling all_plugins_play to load vars for managed_node1 11044 1726853238.28995: Calling groups_plugins_inventory to load vars for managed_node1 11044 1726853238.28998: Calling groups_plugins_play to load vars for managed_node1 11044 1726853238.29138: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11044 1726853238.29257: done with get_vars() 11044 1726853238.29264: done getting variables 11044 1726853238.29338: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [Install yum-utils package] *********************************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:26 Friday 20 September 2024 13:27:18 -0400 (0:00:00.015) 0:00:02.669 ****** 11044 1726853238.29359: entering _queue_task() for managed_node1/package 11044 1726853238.29361: Creating lock for package 11044 1726853238.29556: worker is 1 (out of 1 available) 11044 1726853238.29569: exiting _queue_task() for managed_node1/package 11044 1726853238.29583: done queuing things up, now waiting for results queue to drain 11044 1726853238.29584: waiting for pending results... 
11044 1726853238.29727: running TaskExecutor() for managed_node1/TASK: Install yum-utils package 11044 1726853238.29793: in run() - task 02083763-bbaf-c5a6-f857-0000000000ff 11044 1726853238.29802: variable 'ansible_search_path' from source: unknown 11044 1726853238.29807: variable 'ansible_search_path' from source: unknown 11044 1726853238.29836: calling self._execute() 11044 1726853238.29892: variable 'ansible_host' from source: host vars for 'managed_node1' 11044 1726853238.29895: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11044 1726853238.29903: variable 'omit' from source: magic vars 11044 1726853238.30180: variable 'ansible_distribution' from source: facts 11044 1726853238.30190: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): True 11044 1726853238.30275: variable 'ansible_distribution_major_version' from source: facts 11044 1726853238.30281: Evaluated conditional (ansible_distribution_major_version in ['7', '8']): False 11044 1726853238.30284: when evaluation is False, skipping this task 11044 1726853238.30287: _execute() done 11044 1726853238.30289: dumping result to json 11044 1726853238.30294: done dumping result, returning 11044 1726853238.30300: done running TaskExecutor() for managed_node1/TASK: Install yum-utils package [02083763-bbaf-c5a6-f857-0000000000ff] 11044 1726853238.30303: sending task result for task 02083763-bbaf-c5a6-f857-0000000000ff 11044 1726853238.30388: done sending task result for task 02083763-bbaf-c5a6-f857-0000000000ff 11044 1726853238.30391: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version in ['7', '8']", "skip_reason": "Conditional result was False" } 11044 1726853238.30432: no more pending results, returning what we have 11044 1726853238.30436: results queue empty 11044 1726853238.30437: checking for any_errors_fatal 11044 1726853238.30447: done checking for any_errors_fatal 11044 
1726853238.30448: checking for max_fail_percentage 11044 1726853238.30449: done checking for max_fail_percentage 11044 1726853238.30450: checking to see if all hosts have failed and the running result is not ok 11044 1726853238.30450: done checking to see if all hosts have failed 11044 1726853238.30451: getting the remaining hosts for this loop 11044 1726853238.30452: done getting the remaining hosts for this loop 11044 1726853238.30455: getting the next task for host managed_node1 11044 1726853238.30461: done getting next task for host managed_node1 11044 1726853238.30463: ^ task is: TASK: Enable EPEL 7 11044 1726853238.30466: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11044 1726853238.30469: getting variables 11044 1726853238.30470: in VariableManager get_vars() 11044 1726853238.30494: Calling all_inventory to load vars for managed_node1 11044 1726853238.30496: Calling groups_inventory to load vars for managed_node1 11044 1726853238.30499: Calling all_plugins_inventory to load vars for managed_node1 11044 1726853238.30508: Calling all_plugins_play to load vars for managed_node1 11044 1726853238.30510: Calling groups_plugins_inventory to load vars for managed_node1 11044 1726853238.30512: Calling groups_plugins_play to load vars for managed_node1 11044 1726853238.30628: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11044 1726853238.30742: done with get_vars() 11044 1726853238.30750: done getting variables 11044 1726853238.30792: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Enable EPEL 7] *********************************************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:32 Friday 20 September 2024 13:27:18 -0400 (0:00:00.014) 0:00:02.683 ****** 11044 1726853238.30812: entering _queue_task() for managed_node1/command 11044 1726853238.30996: worker is 1 (out of 1 available) 11044 1726853238.31009: exiting _queue_task() for managed_node1/command 11044 1726853238.31022: done queuing things up, now waiting for results queue to drain 11044 1726853238.31023: waiting for pending results... 
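Every EPEL task skipped in this stretch ("Create EPEL 10", "Install yum-utils package", "Enable EPEL 7", "Enable EPEL 8") fails the same guard: the distribution check passes, but major version 10 is not in `['7', '8']`. A hypothetical sketch of the shared pattern in `enable_epel.yml` — the actual commands are not recoverable from a skipped run and are shown as placeholders. Note that the first task's name is templated, which is why the state dump shows `Create EPEL {{ ansible_distribution_major_version }}` while the rendered output reads `TASK [Create EPEL 10]`:

```yaml
# Hypothetical sketch of the guard shared by the EPEL tasks in
# tests/network/tasks/enable_epel.yml.  Only the task names, action
# plugins, and evaluated conditionals appear in the log.
- name: Create EPEL {{ ansible_distribution_major_version }}
  command: "..."        # not recoverable from the skipped run
  when:
    - ansible_distribution in ['RedHat', 'CentOS']
    - ansible_distribution_major_version in ['7', '8']

- name: Install yum-utils package
  package:
    name: yum-utils     # assumed from the task name
  when:
    - ansible_distribution in ['RedHat', 'CentOS']
    - ansible_distribution_major_version in ['7', '8']
```

On this CentOS 10 host both version guards evaluate False, so each task is skipped before its module runs; "Enable EPEL 6" further below is skipped the same way under its `ansible_distribution_major_version == '6'` condition.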
11044 1726853238.31176: running TaskExecutor() for managed_node1/TASK: Enable EPEL 7 11044 1726853238.31238: in run() - task 02083763-bbaf-c5a6-f857-000000000100 11044 1726853238.31254: variable 'ansible_search_path' from source: unknown 11044 1726853238.31259: variable 'ansible_search_path' from source: unknown 11044 1726853238.31286: calling self._execute() 11044 1726853238.31340: variable 'ansible_host' from source: host vars for 'managed_node1' 11044 1726853238.31345: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11044 1726853238.31360: variable 'omit' from source: magic vars 11044 1726853238.31673: variable 'ansible_distribution' from source: facts 11044 1726853238.31687: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): True 11044 1726853238.31774: variable 'ansible_distribution_major_version' from source: facts 11044 1726853238.31778: Evaluated conditional (ansible_distribution_major_version in ['7', '8']): False 11044 1726853238.31781: when evaluation is False, skipping this task 11044 1726853238.31784: _execute() done 11044 1726853238.31786: dumping result to json 11044 1726853238.31793: done dumping result, returning 11044 1726853238.31797: done running TaskExecutor() for managed_node1/TASK: Enable EPEL 7 [02083763-bbaf-c5a6-f857-000000000100] 11044 1726853238.31804: sending task result for task 02083763-bbaf-c5a6-f857-000000000100 11044 1726853238.31884: done sending task result for task 02083763-bbaf-c5a6-f857-000000000100 11044 1726853238.31887: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version in ['7', '8']", "skip_reason": "Conditional result was False" } 11044 1726853238.31939: no more pending results, returning what we have 11044 1726853238.31943: results queue empty 11044 1726853238.31944: checking for any_errors_fatal 11044 1726853238.31949: done checking for any_errors_fatal 11044 1726853238.31949: checking for 
max_fail_percentage 11044 1726853238.31951: done checking for max_fail_percentage 11044 1726853238.31952: checking to see if all hosts have failed and the running result is not ok 11044 1726853238.31953: done checking to see if all hosts have failed 11044 1726853238.31953: getting the remaining hosts for this loop 11044 1726853238.31955: done getting the remaining hosts for this loop 11044 1726853238.31957: getting the next task for host managed_node1 11044 1726853238.31964: done getting next task for host managed_node1 11044 1726853238.31966: ^ task is: TASK: Enable EPEL 8 11044 1726853238.31970: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11044 1726853238.31975: getting variables 11044 1726853238.31976: in VariableManager get_vars() 11044 1726853238.32000: Calling all_inventory to load vars for managed_node1 11044 1726853238.32002: Calling groups_inventory to load vars for managed_node1 11044 1726853238.32005: Calling all_plugins_inventory to load vars for managed_node1 11044 1726853238.32014: Calling all_plugins_play to load vars for managed_node1 11044 1726853238.32016: Calling groups_plugins_inventory to load vars for managed_node1 11044 1726853238.32018: Calling groups_plugins_play to load vars for managed_node1 11044 1726853238.32167: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11044 1726853238.32287: done with get_vars() 11044 1726853238.32294: done getting variables 11044 1726853238.32334: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Enable EPEL 8] *********************************************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:37 Friday 20 September 2024 13:27:18 -0400 (0:00:00.015) 0:00:02.699 ****** 11044 1726853238.32355: entering _queue_task() for managed_node1/command 11044 1726853238.32552: worker is 1 (out of 1 available) 11044 1726853238.32565: exiting _queue_task() for managed_node1/command 11044 1726853238.32580: done queuing things up, now waiting for results queue to drain 11044 1726853238.32581: waiting for pending results... 
11044 1726853238.32731: running TaskExecutor() for managed_node1/TASK: Enable EPEL 8 11044 1726853238.32801: in run() - task 02083763-bbaf-c5a6-f857-000000000101 11044 1726853238.32811: variable 'ansible_search_path' from source: unknown 11044 1726853238.32815: variable 'ansible_search_path' from source: unknown 11044 1726853238.32838: calling self._execute() 11044 1726853238.32894: variable 'ansible_host' from source: host vars for 'managed_node1' 11044 1726853238.32904: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11044 1726853238.32912: variable 'omit' from source: magic vars 11044 1726853238.33185: variable 'ansible_distribution' from source: facts 11044 1726853238.33196: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): True 11044 1726853238.33286: variable 'ansible_distribution_major_version' from source: facts 11044 1726853238.33290: Evaluated conditional (ansible_distribution_major_version in ['7', '8']): False 11044 1726853238.33293: when evaluation is False, skipping this task 11044 1726853238.33296: _execute() done 11044 1726853238.33300: dumping result to json 11044 1726853238.33303: done dumping result, returning 11044 1726853238.33310: done running TaskExecutor() for managed_node1/TASK: Enable EPEL 8 [02083763-bbaf-c5a6-f857-000000000101] 11044 1726853238.33313: sending task result for task 02083763-bbaf-c5a6-f857-000000000101 skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version in ['7', '8']", "skip_reason": "Conditional result was False" } 11044 1726853238.33446: no more pending results, returning what we have 11044 1726853238.33450: results queue empty 11044 1726853238.33451: checking for any_errors_fatal 11044 1726853238.33457: done checking for any_errors_fatal 11044 1726853238.33457: checking for max_fail_percentage 11044 1726853238.33459: done checking for max_fail_percentage 11044 1726853238.33460: checking to see if all hosts have failed and 
the running result is not ok 11044 1726853238.33460: done checking to see if all hosts have failed 11044 1726853238.33461: getting the remaining hosts for this loop 11044 1726853238.33462: done getting the remaining hosts for this loop 11044 1726853238.33466: getting the next task for host managed_node1 11044 1726853238.33475: done getting next task for host managed_node1 11044 1726853238.33478: ^ task is: TASK: Enable EPEL 6 11044 1726853238.33483: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11044 1726853238.33486: getting variables 11044 1726853238.33487: in VariableManager get_vars() 11044 1726853238.33510: Calling all_inventory to load vars for managed_node1 11044 1726853238.33512: Calling groups_inventory to load vars for managed_node1 11044 1726853238.33515: Calling all_plugins_inventory to load vars for managed_node1 11044 1726853238.33524: Calling all_plugins_play to load vars for managed_node1 11044 1726853238.33526: Calling groups_plugins_inventory to load vars for managed_node1 11044 1726853238.33528: Calling groups_plugins_play to load vars for managed_node1 11044 1726853238.33646: done sending task result for task 02083763-bbaf-c5a6-f857-000000000101 11044 1726853238.33650: WORKER PROCESS EXITING 11044 1726853238.33660: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11044 1726853238.33775: done with get_vars() 11044 1726853238.33782: done getting variables 11044 1726853238.33824: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Enable EPEL 6] *********************************************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:42 Friday 20 September 2024 13:27:18 -0400 (0:00:00.014) 0:00:02.714 ****** 11044 1726853238.33843: entering _queue_task() for managed_node1/copy 11044 1726853238.34032: worker is 1 (out of 1 available) 11044 1726853238.34048: exiting _queue_task() for managed_node1/copy 11044 1726853238.34060: done queuing things up, now waiting for results queue to drain 11044 1726853238.34062: waiting for pending results... 
11044 1726853238.34210: running TaskExecutor() for managed_node1/TASK: Enable EPEL 6 11044 1726853238.34267: in run() - task 02083763-bbaf-c5a6-f857-000000000103 11044 1726853238.34282: variable 'ansible_search_path' from source: unknown 11044 1726853238.34286: variable 'ansible_search_path' from source: unknown 11044 1726853238.34314: calling self._execute() 11044 1726853238.34369: variable 'ansible_host' from source: host vars for 'managed_node1' 11044 1726853238.34376: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11044 1726853238.34384: variable 'omit' from source: magic vars 11044 1726853238.34706: variable 'ansible_distribution' from source: facts 11044 1726853238.34715: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): True 11044 1726853238.34827: variable 'ansible_distribution_major_version' from source: facts 11044 1726853238.34831: Evaluated conditional (ansible_distribution_major_version == '6'): False 11044 1726853238.34836: when evaluation is False, skipping this task 11044 1726853238.34839: _execute() done 11044 1726853238.34842: dumping result to json 11044 1726853238.34847: done dumping result, returning 11044 1726853238.34850: done running TaskExecutor() for managed_node1/TASK: Enable EPEL 6 [02083763-bbaf-c5a6-f857-000000000103] 11044 1726853238.34852: sending task result for task 02083763-bbaf-c5a6-f857-000000000103 11044 1726853238.34935: done sending task result for task 02083763-bbaf-c5a6-f857-000000000103 11044 1726853238.34938: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version == '6'", "skip_reason": "Conditional result was False" } 11044 1726853238.35003: no more pending results, returning what we have 11044 1726853238.35005: results queue empty 11044 1726853238.35006: checking for any_errors_fatal 11044 1726853238.35010: done checking for any_errors_fatal 11044 1726853238.35011: checking for max_fail_percentage 
11044 1726853238.35012: done checking for max_fail_percentage 11044 1726853238.35013: checking to see if all hosts have failed and the running result is not ok 11044 1726853238.35014: done checking to see if all hosts have failed 11044 1726853238.35014: getting the remaining hosts for this loop 11044 1726853238.35015: done getting the remaining hosts for this loop 11044 1726853238.35018: getting the next task for host managed_node1 11044 1726853238.35026: done getting next task for host managed_node1 11044 1726853238.35028: ^ task is: TASK: Set network provider to 'nm' 11044 1726853238.35030: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11044 1726853238.35033: getting variables 11044 1726853238.35034: in VariableManager get_vars() 11044 1726853238.35060: Calling all_inventory to load vars for managed_node1 11044 1726853238.35063: Calling groups_inventory to load vars for managed_node1 11044 1726853238.35065: Calling all_plugins_inventory to load vars for managed_node1 11044 1726853238.35076: Calling all_plugins_play to load vars for managed_node1 11044 1726853238.35078: Calling groups_plugins_inventory to load vars for managed_node1 11044 1726853238.35081: Calling groups_plugins_play to load vars for managed_node1 11044 1726853238.35214: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11044 1726853238.35325: done with get_vars() 11044 1726853238.35332: done getting variables 11044 1726853238.35374: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 
(found_in_cache=True, class_only=True) TASK [Set network provider to 'nm'] ******************************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/tests_bond_deprecated_nm.yml:13 Friday 20 September 2024 13:27:18 -0400 (0:00:00.015) 0:00:02.729 ****** 11044 1726853238.35393: entering _queue_task() for managed_node1/set_fact 11044 1726853238.35585: worker is 1 (out of 1 available) 11044 1726853238.35598: exiting _queue_task() for managed_node1/set_fact 11044 1726853238.35608: done queuing things up, now waiting for results queue to drain 11044 1726853238.35609: waiting for pending results... 11044 1726853238.35759: running TaskExecutor() for managed_node1/TASK: Set network provider to 'nm' 11044 1726853238.35802: in run() - task 02083763-bbaf-c5a6-f857-000000000007 11044 1726853238.35814: variable 'ansible_search_path' from source: unknown 11044 1726853238.35847: calling self._execute() 11044 1726853238.35902: variable 'ansible_host' from source: host vars for 'managed_node1' 11044 1726853238.35907: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11044 1726853238.35914: variable 'omit' from source: magic vars 11044 1726853238.35991: variable 'omit' from source: magic vars 11044 1726853238.36014: variable 'omit' from source: magic vars 11044 1726853238.36039: variable 'omit' from source: magic vars 11044 1726853238.36081: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11044 1726853238.36106: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11044 1726853238.36122: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11044 1726853238.36135: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11044 1726853238.36147: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11044 1726853238.36170: variable 'inventory_hostname' from source: host vars for 'managed_node1' 11044 1726853238.36175: variable 'ansible_host' from source: host vars for 'managed_node1' 11044 1726853238.36177: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11044 1726853238.36251: Set connection var ansible_timeout to 10 11044 1726853238.36255: Set connection var ansible_shell_executable to /bin/sh 11044 1726853238.36257: Set connection var ansible_shell_type to sh 11044 1726853238.36264: Set connection var ansible_module_compression to ZIP_DEFLATED 11044 1726853238.36267: Set connection var ansible_connection to ssh 11044 1726853238.36273: Set connection var ansible_pipelining to False 11044 1726853238.36298: variable 'ansible_shell_executable' from source: unknown 11044 1726853238.36301: variable 'ansible_connection' from source: unknown 11044 1726853238.36304: variable 'ansible_module_compression' from source: unknown 11044 1726853238.36306: variable 'ansible_shell_type' from source: unknown 11044 1726853238.36308: variable 'ansible_shell_executable' from source: unknown 11044 1726853238.36311: variable 'ansible_host' from source: host vars for 'managed_node1' 11044 1726853238.36313: variable 'ansible_pipelining' from source: unknown 11044 1726853238.36315: variable 'ansible_timeout' from source: unknown 11044 1726853238.36317: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11044 1726853238.36421: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11044 1726853238.36470: variable 'omit' from source: magic vars 11044 1726853238.36476: starting 
attempt loop 11044 1726853238.36479: running the handler 11044 1726853238.36483: handler run complete 11044 1726853238.36486: attempt loop complete, returning result 11044 1726853238.36488: _execute() done 11044 1726853238.36490: dumping result to json 11044 1726853238.36492: done dumping result, returning 11044 1726853238.36494: done running TaskExecutor() for managed_node1/TASK: Set network provider to 'nm' [02083763-bbaf-c5a6-f857-000000000007] 11044 1726853238.36496: sending task result for task 02083763-bbaf-c5a6-f857-000000000007 ok: [managed_node1] => { "ansible_facts": { "network_provider": "nm" }, "changed": false } 11044 1726853238.36805: no more pending results, returning what we have 11044 1726853238.36808: results queue empty 11044 1726853238.36809: checking for any_errors_fatal 11044 1726853238.36813: done checking for any_errors_fatal 11044 1726853238.36814: checking for max_fail_percentage 11044 1726853238.36816: done checking for max_fail_percentage 11044 1726853238.36817: checking to see if all hosts have failed and the running result is not ok 11044 1726853238.36817: done checking to see if all hosts have failed 11044 1726853238.36818: getting the remaining hosts for this loop 11044 1726853238.36819: done getting the remaining hosts for this loop 11044 1726853238.36822: getting the next task for host managed_node1 11044 1726853238.36829: done getting next task for host managed_node1 11044 1726853238.36831: ^ task is: TASK: meta (flush_handlers) 11044 1726853238.36833: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11044 1726853238.36836: getting variables 11044 1726853238.36837: in VariableManager get_vars() 11044 1726853238.36861: Calling all_inventory to load vars for managed_node1 11044 1726853238.36863: Calling groups_inventory to load vars for managed_node1 11044 1726853238.36866: Calling all_plugins_inventory to load vars for managed_node1 11044 1726853238.36878: Calling all_plugins_play to load vars for managed_node1 11044 1726853238.36880: Calling groups_plugins_inventory to load vars for managed_node1 11044 1726853238.36884: Calling groups_plugins_play to load vars for managed_node1 11044 1726853238.37032: done sending task result for task 02083763-bbaf-c5a6-f857-000000000007 11044 1726853238.37036: WORKER PROCESS EXITING 11044 1726853238.37057: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11044 1726853238.37425: done with get_vars() 11044 1726853238.37433: done getting variables 11044 1726853238.37501: in VariableManager get_vars() 11044 1726853238.37509: Calling all_inventory to load vars for managed_node1 11044 1726853238.37511: Calling groups_inventory to load vars for managed_node1 11044 1726853238.37514: Calling all_plugins_inventory to load vars for managed_node1 11044 1726853238.37518: Calling all_plugins_play to load vars for managed_node1 11044 1726853238.37520: Calling groups_plugins_inventory to load vars for managed_node1 11044 1726853238.37522: Calling groups_plugins_play to load vars for managed_node1 11044 1726853238.37645: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11044 1726853238.37787: done with get_vars() 11044 1726853238.37798: done queuing things up, now waiting for results queue to drain 11044 1726853238.37799: results queue empty 11044 1726853238.37800: checking for any_errors_fatal 11044 1726853238.37801: done checking for any_errors_fatal 11044 1726853238.37801: checking for max_fail_percentage 11044 
1726853238.37802: done checking for max_fail_percentage 11044 1726853238.37803: checking to see if all hosts have failed and the running result is not ok 11044 1726853238.37803: done checking to see if all hosts have failed 11044 1726853238.37803: getting the remaining hosts for this loop 11044 1726853238.37804: done getting the remaining hosts for this loop 11044 1726853238.37806: getting the next task for host managed_node1 11044 1726853238.37808: done getting next task for host managed_node1 11044 1726853238.37809: ^ task is: TASK: meta (flush_handlers) 11044 1726853238.37810: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11044 1726853238.37816: getting variables 11044 1726853238.37817: in VariableManager get_vars() 11044 1726853238.37823: Calling all_inventory to load vars for managed_node1 11044 1726853238.37825: Calling groups_inventory to load vars for managed_node1 11044 1726853238.37826: Calling all_plugins_inventory to load vars for managed_node1 11044 1726853238.37829: Calling all_plugins_play to load vars for managed_node1 11044 1726853238.37831: Calling groups_plugins_inventory to load vars for managed_node1 11044 1726853238.37832: Calling groups_plugins_play to load vars for managed_node1 11044 1726853238.37918: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11044 1726853238.38034: done with get_vars() 11044 1726853238.38039: done getting variables 11044 1726853238.38068: in VariableManager get_vars() 11044 1726853238.38076: Calling all_inventory to load vars for managed_node1 11044 1726853238.38077: Calling groups_inventory to load vars for managed_node1 11044 1726853238.38078: Calling all_plugins_inventory to load vars for managed_node1 11044 
1726853238.38081: Calling all_plugins_play to load vars for managed_node1 11044 1726853238.38083: Calling groups_plugins_inventory to load vars for managed_node1 11044 1726853238.38084: Calling groups_plugins_play to load vars for managed_node1 11044 1726853238.38160: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11044 1726853238.38268: done with get_vars() 11044 1726853238.38277: done queuing things up, now waiting for results queue to drain 11044 1726853238.38278: results queue empty 11044 1726853238.38279: checking for any_errors_fatal 11044 1726853238.38279: done checking for any_errors_fatal 11044 1726853238.38280: checking for max_fail_percentage 11044 1726853238.38280: done checking for max_fail_percentage 11044 1726853238.38281: checking to see if all hosts have failed and the running result is not ok 11044 1726853238.38281: done checking to see if all hosts have failed 11044 1726853238.38282: getting the remaining hosts for this loop 11044 1726853238.38282: done getting the remaining hosts for this loop 11044 1726853238.38284: getting the next task for host managed_node1 11044 1726853238.38286: done getting next task for host managed_node1 11044 1726853238.38286: ^ task is: None 11044 1726853238.38287: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11044 1726853238.38288: done queuing things up, now waiting for results queue to drain 11044 1726853238.38288: results queue empty 11044 1726853238.38289: checking for any_errors_fatal 11044 1726853238.38289: done checking for any_errors_fatal 11044 1726853238.38289: checking for max_fail_percentage 11044 1726853238.38290: done checking for max_fail_percentage 11044 1726853238.38290: checking to see if all hosts have failed and the running result is not ok 11044 1726853238.38291: done checking to see if all hosts have failed 11044 1726853238.38292: getting the next task for host managed_node1 11044 1726853238.38293: done getting next task for host managed_node1 11044 1726853238.38294: ^ task is: None 11044 1726853238.38294: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11044 1726853238.38331: in VariableManager get_vars() 11044 1726853238.38350: done with get_vars() 11044 1726853238.38354: in VariableManager get_vars() 11044 1726853238.38363: done with get_vars() 11044 1726853238.38365: variable 'omit' from source: magic vars 11044 1726853238.38388: in VariableManager get_vars() 11044 1726853238.38397: done with get_vars() 11044 1726853238.38411: variable 'omit' from source: magic vars PLAY [Play for testing bond device using deprecated 'master' argument] ********* 11044 1726853238.38816: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py (found_in_cache=True, class_only=False) 11044 1726853238.38838: getting the remaining hosts for this loop 11044 1726853238.38839: done getting the remaining hosts for this loop 11044 1726853238.38840: getting the next task for host managed_node1 11044 1726853238.38842: done getting next task for host managed_node1 11044 1726853238.38843: ^ task is: TASK: Gathering Facts 11044 1726853238.38845: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11044 1726853238.38846: getting variables 11044 1726853238.38847: in VariableManager get_vars() 11044 1726853238.38855: Calling all_inventory to load vars for managed_node1 11044 1726853238.38857: Calling groups_inventory to load vars for managed_node1 11044 1726853238.38859: Calling all_plugins_inventory to load vars for managed_node1 11044 1726853238.38863: Calling all_plugins_play to load vars for managed_node1 11044 1726853238.38873: Calling groups_plugins_inventory to load vars for managed_node1 11044 1726853238.38875: Calling groups_plugins_play to load vars for managed_node1 11044 1726853238.38968: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11044 1726853238.39074: done with get_vars() 11044 1726853238.39080: done getting variables 11044 1726853238.39110: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Gathering Facts] ********************************************************* task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond_deprecated.yml:3 Friday 20 September 2024 13:27:18 -0400 (0:00:00.037) 0:00:02.767 ****** 11044 1726853238.39126: entering _queue_task() for managed_node1/gather_facts 11044 1726853238.39347: worker is 1 (out of 1 available) 11044 1726853238.39357: exiting _queue_task() for managed_node1/gather_facts 11044 1726853238.39369: done queuing things up, now waiting for results queue to drain 11044 1726853238.39370: waiting for pending results... 
11044 1726853238.39524: running TaskExecutor() for managed_node1/TASK: Gathering Facts 11044 1726853238.39587: in run() - task 02083763-bbaf-c5a6-f857-000000000129 11044 1726853238.39610: variable 'ansible_search_path' from source: unknown 11044 1726853238.39650: calling self._execute() 11044 1726853238.39739: variable 'ansible_host' from source: host vars for 'managed_node1' 11044 1726853238.39743: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11044 1726853238.39911: variable 'omit' from source: magic vars 11044 1726853238.40234: variable 'ansible_distribution_major_version' from source: facts 11044 1726853238.40256: Evaluated conditional (ansible_distribution_major_version != '6'): True 11044 1726853238.40267: variable 'omit' from source: magic vars 11044 1726853238.40299: variable 'omit' from source: magic vars 11044 1726853238.40332: variable 'omit' from source: magic vars 11044 1726853238.40376: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11044 1726853238.40419: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11044 1726853238.40447: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11044 1726853238.40470: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11044 1726853238.40490: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11044 1726853238.40524: variable 'inventory_hostname' from source: host vars for 'managed_node1' 11044 1726853238.40531: variable 'ansible_host' from source: host vars for 'managed_node1' 11044 1726853238.40537: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11044 1726853238.40638: Set connection var ansible_timeout to 10 11044 1726853238.40653: Set connection 
var ansible_shell_executable to /bin/sh 11044 1726853238.40660: Set connection var ansible_shell_type to sh 11044 1726853238.40669: Set connection var ansible_module_compression to ZIP_DEFLATED 11044 1726853238.40679: Set connection var ansible_connection to ssh 11044 1726853238.40776: Set connection var ansible_pipelining to False 11044 1726853238.40779: variable 'ansible_shell_executable' from source: unknown 11044 1726853238.40782: variable 'ansible_connection' from source: unknown 11044 1726853238.40784: variable 'ansible_module_compression' from source: unknown 11044 1726853238.40786: variable 'ansible_shell_type' from source: unknown 11044 1726853238.40788: variable 'ansible_shell_executable' from source: unknown 11044 1726853238.40790: variable 'ansible_host' from source: host vars for 'managed_node1' 11044 1726853238.40792: variable 'ansible_pipelining' from source: unknown 11044 1726853238.40795: variable 'ansible_timeout' from source: unknown 11044 1726853238.40797: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11044 1726853238.40935: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11044 1726853238.40952: variable 'omit' from source: magic vars 11044 1726853238.40962: starting attempt loop 11044 1726853238.40968: running the handler 11044 1726853238.40988: variable 'ansible_facts' from source: unknown 11044 1726853238.41011: _low_level_execute_command(): starting 11044 1726853238.41024: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11044 1726853238.41749: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11044 1726853238.41766: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11044 
1726853238.41785: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11044 1726853238.41807: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11044 1726853238.41911: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK <<< 11044 1726853238.41915: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11044 1726853238.42006: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 11044 1726853238.44041: stdout chunk (state=3): >>>/root <<< 11044 1726853238.44186: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11044 1726853238.44190: stdout chunk (state=3): >>><<< 11044 1726853238.44193: stderr chunk (state=3): >>><<< 11044 1726853238.44286: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 11044 1726853238.44290: _low_level_execute_command(): starting 11044 1726853238.44293: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853238.4422023-11230-151164149323466 `" && echo ansible-tmp-1726853238.4422023-11230-151164149323466="` echo /root/.ansible/tmp/ansible-tmp-1726853238.4422023-11230-151164149323466 `" ) && sleep 0' 11044 1726853238.45235: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11044 1726853238.45343: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 11044 1726853238.45355: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: 
Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK <<< 11044 1726853238.45383: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11044 1726853238.45462: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 11044 1726853238.47394: stdout chunk (state=3): >>>ansible-tmp-1726853238.4422023-11230-151164149323466=/root/.ansible/tmp/ansible-tmp-1726853238.4422023-11230-151164149323466 <<< 11044 1726853238.47559: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11044 1726853238.47562: stdout chunk (state=3): >>><<< 11044 1726853238.47565: stderr chunk (state=3): >>><<< 11044 1726853238.47808: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853238.4422023-11230-151164149323466=/root/.ansible/tmp/ansible-tmp-1726853238.4422023-11230-151164149323466 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 11044 1726853238.47812: variable 'ansible_module_compression' from source: unknown 11044 1726853238.47830: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-1104467doc9gy/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 11044 1726853238.47936: variable 'ansible_facts' from source: unknown 11044 1726853238.48325: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853238.4422023-11230-151164149323466/AnsiballZ_setup.py 11044 1726853238.48738: Sending initial data 11044 1726853238.48783: Sent initial data (154 bytes) 11044 1726853238.49985: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11044 1726853238.50177: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration 
data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK <<< 11044 1726853238.50296: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11044 1726853238.50359: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11044 1726853238.52441: stderr chunk (state=3): >>>debug2: Remote version: 3 <<< 11044 1726853238.52465: stderr chunk (state=3): >>>debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 11044 1726853238.52539: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 11044 1726853238.52583: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-1104467doc9gy/tmpxyord96o /root/.ansible/tmp/ansible-tmp-1726853238.4422023-11230-151164149323466/AnsiballZ_setup.py <<< 11044 1726853238.52606: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853238.4422023-11230-151164149323466/AnsiballZ_setup.py" <<< 11044 1726853238.52652: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-1104467doc9gy/tmpxyord96o" to remote "/root/.ansible/tmp/ansible-tmp-1726853238.4422023-11230-151164149323466/AnsiballZ_setup.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853238.4422023-11230-151164149323466/AnsiballZ_setup.py" <<< 11044 1726853238.54578: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11044 1726853238.54582: stdout chunk (state=3): >>><<< 11044 1726853238.54584: stderr chunk (state=3): >>><<< 11044 1726853238.54586: done transferring module to remote 11044 1726853238.54588: _low_level_execute_command(): starting 11044 1726853238.54590: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853238.4422023-11230-151164149323466/ /root/.ansible/tmp/ansible-tmp-1726853238.4422023-11230-151164149323466/AnsiballZ_setup.py && sleep 0' 11044 1726853238.55265: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing 
configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11044 1726853238.55320: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 11044 1726853238.55341: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11044 1726853238.55373: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11044 1726853238.55441: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 11044 1726853238.58063: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11044 1726853238.58066: stdout chunk (state=3): >>><<< 11044 1726853238.58069: stderr chunk (state=3): >>><<< 11044 1726853238.58074: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 11044 1726853238.58077: _low_level_execute_command(): starting 11044 1726853238.58080: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853238.4422023-11230-151164149323466/AnsiballZ_setup.py && sleep 0' 11044 1726853238.58825: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11044 1726853238.58840: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11044 1726853238.58856: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11044 1726853238.58875: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11044 1726853238.58894: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 <<< 11044 1726853238.58907: stderr chunk (state=3): >>>debug2: match not found <<< 11044 1726853238.58922: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11044 1726853238.58941: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 11044 1726853238.58959: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.45.153 is address <<< 11044 1726853238.58970: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 11044 1726853238.58991: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11044 1726853238.59084: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 11044 1726853238.59097: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11044 1726853238.59114: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11044 1726853238.59193: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11044 1726853239.38597: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_fibre_channel_wwn": [], "ansible_lsb": {}, "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:66b8343d-0be9-40f0-836f-5213a2c1e120", "ansible_is_chroot": false, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_local": {}, "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_env": {"SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.46.199 44238 10.31.45.153 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.46.199 44238 22", 
"DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_apparmor": {"status": "disabled"}, "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCXYnrsBaUY4i0/t1QUWoZkFPnFbGncbkmF01/zUZNuwldCwqYoDxpR2K8ARraEuK9oVLyYO0alCszdGP42db6R4xfRCOhN3faeZXsneupsJk4LLpIBkq0uIokeAtcL1tPOUQQzfsQqzZzp4BmJCVrwmUW5ADnzqCgvB3gsruyTQUrEUJ9MtB5zdaQm5MXuipjeZQThTjYCB2aXv/qTdzfKAwds3CoSZ6HA5GNdi6tahsy3CRIn6VtVkvwrqjJcwo+RaRQzjh+C9AUoH2YQmfLbvog62MsnLk/5OPq5HhxO81pm/TJDsI4LXwLh1VviMOWzVvIaXuKwdmYAdgX1NU561bBzeYzi55qBKo4TcMmnOXiV+Es7dDrKjwwpQKsv5tjSqVkcO6ek3I6SI38DXFOBLZtqXOOLsO12iOReYJUWe/+cgBrz12kDCPmaHFzFFZ3+N0GQ/WiYcgqiUItIhb3xJTbPqls0czPCpCzKo57GyTmv17fpfGhBfRoGX/H1zYs=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBDOnt+7F+RlMaCRRXs8YyuhiHP1FmeDlj4rNB/K2mg1iP9loXXc/XjJ083xMBDu7m7uYLGB+dnmj299Y+RcAQpE=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIKmxoIJtw8UORlc+o+Q7Pks5ERSyhMLnl+Oo8W221WGj", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-45-153.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-45-153", "ansible_nodename": "ip-10-31-45-153.us-east-1.aws.redhat.com", "ansible_domain": 
"us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec26b9e88796a7cb9ebdc2656ce384f6", "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_fips": false, "ansible_iscsi_iqn": "", "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumbe<<< 11044 1726853239.38618: stdout chunk (state=3): >>>r": "38", "day": "20", "hour": "13", "minute": "27", "second": "18", "epoch": "1726853238", "epoch_int": "1726853238", "date": "2024-09-20", "time": "13:27:18", "iso8601_micro": "2024-09-20T17:27:18.946524Z", "iso8601": "2024-09-20T17:27:18Z", "iso8601_basic": "20240920T132718946524", "iso8601_basic_short": "20240920T132718", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 2956, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 575, "free": 2956}, "nocache": {"free": 3289, "used": 242}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", 
"ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec26b9e8-8796-a7cb-9ebd-c2656ce384f6", "ansible_product_uuid": "ec26b9e8-8796-a7cb-9ebd-c2656ce384f6", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["4853857d-d3e2-44a6-8739-3837607c97e1"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["4853857d-d3e2-44a6-8739-3837607c97e1"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 405, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261793607680, "block_size": 4096, "block_total": 65519099, "block_available": 63914455, "block_used": 1604644, "inode_total": 131070960, "inode_available": 131029084, "inode_used": 41876, "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1"}], "ansible_loadavg": {"1m": 0.55859375, "5m": 0.25244140625, "15m": 0.1025390625}, 
"ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_interfaces": ["lo", "eth0"], "ansible_eth0": {"device": "eth0", "macaddress": "02:3a:e7:40:bc:9f", "mtu": 9001, "active": true, "module": "xen_<<< 11044 1726853239.38631: stdout chunk (state=3): >>>netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.45.153", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22"}, "ipv6": [{"address": "fe80::3a:e7ff:fe40:bc9f", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", 
"rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", 
"tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", 
"hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.44.1", "interface": "eth0", "address": "10.31.45.153", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22", "macaddress": "02:3a:e7:40:bc:9f", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.45.153"], "ansible_all_ipv6_addresses": ["fe80::3a:e7ff:fe40:bc9f"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.45.153", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::3a:e7ff:fe40:bc9f"]}, "ansible_pkg_mgr": "dnf", "ansible_service_mgr": "systemd", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 11044 1726853239.41337: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.45.153 closed. 
<<< 11044 1726853239.41363: stderr chunk (state=3): >>><<< 11044 1726853239.41368: stdout chunk (state=3): >>><<< 11044 1726853239.41406: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_fibre_channel_wwn": [], "ansible_lsb": {}, "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:66b8343d-0be9-40f0-836f-5213a2c1e120", "ansible_is_chroot": false, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_local": {}, "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_env": {"SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.46.199 44238 10.31.45.153 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.46.199 44238 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, 
"final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_apparmor": {"status": "disabled"}, "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCXYnrsBaUY4i0/t1QUWoZkFPnFbGncbkmF01/zUZNuwldCwqYoDxpR2K8ARraEuK9oVLyYO0alCszdGP42db6R4xfRCOhN3faeZXsneupsJk4LLpIBkq0uIokeAtcL1tPOUQQzfsQqzZzp4BmJCVrwmUW5ADnzqCgvB3gsruyTQUrEUJ9MtB5zdaQm5MXuipjeZQThTjYCB2aXv/qTdzfKAwds3CoSZ6HA5GNdi6tahsy3CRIn6VtVkvwrqjJcwo+RaRQzjh+C9AUoH2YQmfLbvog62MsnLk/5OPq5HhxO81pm/TJDsI4LXwLh1VviMOWzVvIaXuKwdmYAdgX1NU561bBzeYzi55qBKo4TcMmnOXiV+Es7dDrKjwwpQKsv5tjSqVkcO6ek3I6SI38DXFOBLZtqXOOLsO12iOReYJUWe/+cgBrz12kDCPmaHFzFFZ3+N0GQ/WiYcgqiUItIhb3xJTbPqls0czPCpCzKo57GyTmv17fpfGhBfRoGX/H1zYs=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBDOnt+7F+RlMaCRRXs8YyuhiHP1FmeDlj4rNB/K2mg1iP9loXXc/XjJ083xMBDu7m7uYLGB+dnmj299Y+RcAQpE=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIKmxoIJtw8UORlc+o+Q7Pks5ERSyhMLnl+Oo8W221WGj", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-45-153.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-45-153", "ansible_nodename": "ip-10-31-45-153.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec26b9e88796a7cb9ebdc2656ce384f6", "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": 
"/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_fips": false, "ansible_iscsi_iqn": "", "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "13", "minute": "27", "second": "18", "epoch": "1726853238", "epoch_int": "1726853238", "date": "2024-09-20", "time": "13:27:18", "iso8601_micro": "2024-09-20T17:27:18.946524Z", "iso8601": "2024-09-20T17:27:18Z", "iso8601_basic": "20240920T132718946524", "iso8601_basic_short": "20240920T132718", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 2956, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 575, "free": 2956}, "nocache": {"free": 3289, "used": 242}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec26b9e8-8796-a7cb-9ebd-c2656ce384f6", "ansible_product_uuid": "ec26b9e8-8796-a7cb-9ebd-c2656ce384f6", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": 
[], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["4853857d-d3e2-44a6-8739-3837607c97e1"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["4853857d-d3e2-44a6-8739-3837607c97e1"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 405, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261793607680, "block_size": 4096, "block_total": 65519099, "block_available": 63914455, "block_used": 1604644, "inode_total": 131070960, "inode_available": 131029084, "inode_used": 41876, "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1"}], "ansible_loadavg": {"1m": 0.55859375, "5m": 0.25244140625, "15m": 0.1025390625}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": 
"(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_interfaces": ["lo", "eth0"], "ansible_eth0": {"device": "eth0", "macaddress": "02:3a:e7:40:bc:9f", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.45.153", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22"}, "ipv6": [{"address": "fe80::3a:e7ff:fe40:bc9f", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", 
"tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off 
[fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.44.1", "interface": "eth0", "address": "10.31.45.153", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22", "macaddress": "02:3a:e7:40:bc:9f", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.45.153"], "ansible_all_ipv6_addresses": ["fe80::3a:e7ff:fe40:bc9f"], 
"ansible_locally_reachable_ips": {"ipv4": ["10.31.45.153", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::3a:e7ff:fe40:bc9f"]}, "ansible_pkg_mgr": "dnf", "ansible_service_mgr": "systemd", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.45.153 closed. 
11044 1726853239.41618: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853238.4422023-11230-151164149323466/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11044 1726853239.41635: _low_level_execute_command(): starting 11044 1726853239.41639: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853238.4422023-11230-151164149323466/ > /dev/null 2>&1 && sleep 0' 11044 1726853239.42089: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11044 1726853239.42093: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found <<< 11044 1726853239.42095: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 11044 1726853239.42098: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11044 1726853239.42100: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11044 1726853239.42156: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 11044 1726853239.42160: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11044 1726853239.42202: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11044 1726853239.44829: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11044 1726853239.44854: stderr chunk (state=3): >>><<< 11044 1726853239.44859: stdout chunk (state=3): >>><<< 11044 1726853239.44878: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: 
master session id: 2 debug2: Received exit status from master 0 11044 1726853239.44885: handler run complete 11044 1726853239.44959: variable 'ansible_facts' from source: unknown 11044 1726853239.45036: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11044 1726853239.45218: variable 'ansible_facts' from source: unknown 11044 1726853239.45269: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11044 1726853239.45349: attempt loop complete, returning result 11044 1726853239.45353: _execute() done 11044 1726853239.45355: dumping result to json 11044 1726853239.45373: done dumping result, returning 11044 1726853239.45381: done running TaskExecutor() for managed_node1/TASK: Gathering Facts [02083763-bbaf-c5a6-f857-000000000129] 11044 1726853239.45384: sending task result for task 02083763-bbaf-c5a6-f857-000000000129 11044 1726853239.45628: done sending task result for task 02083763-bbaf-c5a6-f857-000000000129 11044 1726853239.45631: WORKER PROCESS EXITING ok: [managed_node1] 11044 1726853239.45839: no more pending results, returning what we have 11044 1726853239.45841: results queue empty 11044 1726853239.45842: checking for any_errors_fatal 11044 1726853239.45842: done checking for any_errors_fatal 11044 1726853239.45843: checking for max_fail_percentage 11044 1726853239.45844: done checking for max_fail_percentage 11044 1726853239.45845: checking to see if all hosts have failed and the running result is not ok 11044 1726853239.45845: done checking to see if all hosts have failed 11044 1726853239.45846: getting the remaining hosts for this loop 11044 1726853239.45847: done getting the remaining hosts for this loop 11044 1726853239.45849: getting the next task for host managed_node1 11044 1726853239.45853: done getting next task for host managed_node1 11044 1726853239.45854: ^ task is: TASK: meta (flush_handlers) 11044 1726853239.45856: ^ state is: HOST 
STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11044 1726853239.45858: getting variables 11044 1726853239.45859: in VariableManager get_vars() 11044 1726853239.45884: Calling all_inventory to load vars for managed_node1 11044 1726853239.45886: Calling groups_inventory to load vars for managed_node1 11044 1726853239.45887: Calling all_plugins_inventory to load vars for managed_node1 11044 1726853239.45894: Calling all_plugins_play to load vars for managed_node1 11044 1726853239.45898: Calling groups_plugins_inventory to load vars for managed_node1 11044 1726853239.45901: Calling groups_plugins_play to load vars for managed_node1 11044 1726853239.45995: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11044 1726853239.46107: done with get_vars() 11044 1726853239.46115: done getting variables 11044 1726853239.46164: in VariableManager get_vars() 11044 1726853239.46176: Calling all_inventory to load vars for managed_node1 11044 1726853239.46178: Calling groups_inventory to load vars for managed_node1 11044 1726853239.46179: Calling all_plugins_inventory to load vars for managed_node1 11044 1726853239.46182: Calling all_plugins_play to load vars for managed_node1 11044 1726853239.46183: Calling groups_plugins_inventory to load vars for managed_node1 11044 1726853239.46185: Calling groups_plugins_play to load vars for managed_node1 11044 1726853239.46277: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11044 1726853239.46385: done with get_vars() 11044 1726853239.46393: done queuing things up, now waiting for results queue to drain 11044 1726853239.46394: results queue empty 11044 1726853239.46395: checking for 
any_errors_fatal 11044 1726853239.46397: done checking for any_errors_fatal 11044 1726853239.46397: checking for max_fail_percentage 11044 1726853239.46403: done checking for max_fail_percentage 11044 1726853239.46403: checking to see if all hosts have failed and the running result is not ok 11044 1726853239.46404: done checking to see if all hosts have failed 11044 1726853239.46404: getting the remaining hosts for this loop 11044 1726853239.46405: done getting the remaining hosts for this loop 11044 1726853239.46406: getting the next task for host managed_node1 11044 1726853239.46409: done getting next task for host managed_node1 11044 1726853239.46410: ^ task is: TASK: INIT Prepare setup 11044 1726853239.46411: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11044 1726853239.46413: getting variables 11044 1726853239.46413: in VariableManager get_vars() 11044 1726853239.46421: Calling all_inventory to load vars for managed_node1 11044 1726853239.46422: Calling groups_inventory to load vars for managed_node1 11044 1726853239.46424: Calling all_plugins_inventory to load vars for managed_node1 11044 1726853239.46426: Calling all_plugins_play to load vars for managed_node1 11044 1726853239.46428: Calling groups_plugins_inventory to load vars for managed_node1 11044 1726853239.46429: Calling groups_plugins_play to load vars for managed_node1 11044 1726853239.46508: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11044 1726853239.46615: done with get_vars() 11044 1726853239.46622: done getting variables 11044 1726853239.46677: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [INIT Prepare setup] ****************************************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond_deprecated.yml:15 Friday 20 September 2024 13:27:19 -0400 (0:00:01.075) 0:00:03.842 ****** 11044 1726853239.46696: entering _queue_task() for managed_node1/debug 11044 1726853239.46697: Creating lock for debug 11044 1726853239.46905: worker is 1 (out of 1 available) 11044 1726853239.46918: exiting _queue_task() for managed_node1/debug 11044 1726853239.46928: done queuing things up, now waiting for results queue to drain 11044 1726853239.46929: waiting for pending results... 
11044 1726853239.47084: running TaskExecutor() for managed_node1/TASK: INIT Prepare setup 11044 1726853239.47135: in run() - task 02083763-bbaf-c5a6-f857-00000000000b 11044 1726853239.47145: variable 'ansible_search_path' from source: unknown 11044 1726853239.47180: calling self._execute() 11044 1726853239.47240: variable 'ansible_host' from source: host vars for 'managed_node1' 11044 1726853239.47245: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11044 1726853239.47255: variable 'omit' from source: magic vars 11044 1726853239.47527: variable 'ansible_distribution_major_version' from source: facts 11044 1726853239.47536: Evaluated conditional (ansible_distribution_major_version != '6'): True 11044 1726853239.47542: variable 'omit' from source: magic vars 11044 1726853239.47560: variable 'omit' from source: magic vars 11044 1726853239.47598: variable 'omit' from source: magic vars 11044 1726853239.47626: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11044 1726853239.47656: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11044 1726853239.47672: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11044 1726853239.47685: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11044 1726853239.47698: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11044 1726853239.47722: variable 'inventory_hostname' from source: host vars for 'managed_node1' 11044 1726853239.47725: variable 'ansible_host' from source: host vars for 'managed_node1' 11044 1726853239.47728: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11044 1726853239.47798: Set connection var ansible_timeout to 10 11044 1726853239.47804: Set connection 
var ansible_shell_executable to /bin/sh 11044 1726853239.47809: Set connection var ansible_shell_type to sh 11044 1726853239.47812: Set connection var ansible_module_compression to ZIP_DEFLATED 11044 1726853239.47814: Set connection var ansible_connection to ssh 11044 1726853239.47825: Set connection var ansible_pipelining to False 11044 1726853239.47840: variable 'ansible_shell_executable' from source: unknown 11044 1726853239.47843: variable 'ansible_connection' from source: unknown 11044 1726853239.47846: variable 'ansible_module_compression' from source: unknown 11044 1726853239.47850: variable 'ansible_shell_type' from source: unknown 11044 1726853239.47853: variable 'ansible_shell_executable' from source: unknown 11044 1726853239.47855: variable 'ansible_host' from source: host vars for 'managed_node1' 11044 1726853239.47859: variable 'ansible_pipelining' from source: unknown 11044 1726853239.47862: variable 'ansible_timeout' from source: unknown 11044 1726853239.47866: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11044 1726853239.47965: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11044 1726853239.47974: variable 'omit' from source: magic vars 11044 1726853239.47977: starting attempt loop 11044 1726853239.47980: running the handler 11044 1726853239.48017: handler run complete 11044 1726853239.48034: attempt loop complete, returning result 11044 1726853239.48037: _execute() done 11044 1726853239.48039: dumping result to json 11044 1726853239.48041: done dumping result, returning 11044 1726853239.48053: done running TaskExecutor() for managed_node1/TASK: INIT Prepare setup [02083763-bbaf-c5a6-f857-00000000000b] 11044 1726853239.48055: sending task result for task 
02083763-bbaf-c5a6-f857-00000000000b 11044 1726853239.48132: done sending task result for task 02083763-bbaf-c5a6-f857-00000000000b 11044 1726853239.48135: WORKER PROCESS EXITING ok: [managed_node1] => {} MSG: ################################################## 11044 1726853239.48181: no more pending results, returning what we have 11044 1726853239.48184: results queue empty 11044 1726853239.48185: checking for any_errors_fatal 11044 1726853239.48187: done checking for any_errors_fatal 11044 1726853239.48187: checking for max_fail_percentage 11044 1726853239.48189: done checking for max_fail_percentage 11044 1726853239.48190: checking to see if all hosts have failed and the running result is not ok 11044 1726853239.48190: done checking to see if all hosts have failed 11044 1726853239.48191: getting the remaining hosts for this loop 11044 1726853239.48192: done getting the remaining hosts for this loop 11044 1726853239.48195: getting the next task for host managed_node1 11044 1726853239.48201: done getting next task for host managed_node1 11044 1726853239.48203: ^ task is: TASK: Install dnsmasq 11044 1726853239.48206: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11044 1726853239.48209: getting variables 11044 1726853239.48210: in VariableManager get_vars() 11044 1726853239.48246: Calling all_inventory to load vars for managed_node1 11044 1726853239.48249: Calling groups_inventory to load vars for managed_node1 11044 1726853239.48251: Calling all_plugins_inventory to load vars for managed_node1 11044 1726853239.48259: Calling all_plugins_play to load vars for managed_node1 11044 1726853239.48261: Calling groups_plugins_inventory to load vars for managed_node1 11044 1726853239.48263: Calling groups_plugins_play to load vars for managed_node1 11044 1726853239.48425: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11044 1726853239.48542: done with get_vars() 11044 1726853239.48551: done getting variables 11044 1726853239.48589: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Install dnsmasq] ********************************************************* task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_test_interfaces_with_dhcp.yml:3 Friday 20 September 2024 13:27:19 -0400 (0:00:00.019) 0:00:03.861 ****** 11044 1726853239.48612: entering _queue_task() for managed_node1/package 11044 1726853239.48801: worker is 1 (out of 1 available) 11044 1726853239.48812: exiting _queue_task() for managed_node1/package 11044 1726853239.48823: done queuing things up, now waiting for results queue to drain 11044 1726853239.48825: waiting for pending results... 
11044 1726853239.48974: running TaskExecutor() for managed_node1/TASK: Install dnsmasq 11044 1726853239.49039: in run() - task 02083763-bbaf-c5a6-f857-00000000000f 11044 1726853239.49057: variable 'ansible_search_path' from source: unknown 11044 1726853239.49061: variable 'ansible_search_path' from source: unknown 11044 1726853239.49085: calling self._execute() 11044 1726853239.49147: variable 'ansible_host' from source: host vars for 'managed_node1' 11044 1726853239.49151: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11044 1726853239.49163: variable 'omit' from source: magic vars 11044 1726853239.49421: variable 'ansible_distribution_major_version' from source: facts 11044 1726853239.49431: Evaluated conditional (ansible_distribution_major_version != '6'): True 11044 1726853239.49436: variable 'omit' from source: magic vars 11044 1726853239.49469: variable 'omit' from source: magic vars 11044 1726853239.49598: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 11044 1726853239.50979: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 11044 1726853239.51023: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 11044 1726853239.51050: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 11044 1726853239.51076: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 11044 1726853239.51096: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 11044 1726853239.51165: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11044 1726853239.51185: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11044 1726853239.51202: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11044 1726853239.51231: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11044 1726853239.51243: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11044 1726853239.51314: variable '__network_is_ostree' from source: set_fact 11044 1726853239.51318: variable 'omit' from source: magic vars 11044 1726853239.51343: variable 'omit' from source: magic vars 11044 1726853239.51365: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11044 1726853239.51387: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11044 1726853239.51400: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11044 1726853239.51413: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11044 1726853239.51421: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11044 1726853239.51448: variable 'inventory_hostname' from source: host vars for 'managed_node1' 11044 1726853239.51452: variable 'ansible_host' from source: host vars for 'managed_node1' 11044 1726853239.51456: variable 'ansible_ssh_extra_args' from 
source: host vars for 'managed_node1' 11044 1726853239.51519: Set connection var ansible_timeout to 10 11044 1726853239.51526: Set connection var ansible_shell_executable to /bin/sh 11044 1726853239.51529: Set connection var ansible_shell_type to sh 11044 1726853239.51533: Set connection var ansible_module_compression to ZIP_DEFLATED 11044 1726853239.51538: Set connection var ansible_connection to ssh 11044 1726853239.51543: Set connection var ansible_pipelining to False 11044 1726853239.51563: variable 'ansible_shell_executable' from source: unknown 11044 1726853239.51568: variable 'ansible_connection' from source: unknown 11044 1726853239.51572: variable 'ansible_module_compression' from source: unknown 11044 1726853239.51575: variable 'ansible_shell_type' from source: unknown 11044 1726853239.51577: variable 'ansible_shell_executable' from source: unknown 11044 1726853239.51579: variable 'ansible_host' from source: host vars for 'managed_node1' 11044 1726853239.51582: variable 'ansible_pipelining' from source: unknown 11044 1726853239.51584: variable 'ansible_timeout' from source: unknown 11044 1726853239.51586: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11044 1726853239.51651: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11044 1726853239.51663: variable 'omit' from source: magic vars 11044 1726853239.51666: starting attempt loop 11044 1726853239.51669: running the handler 11044 1726853239.51673: variable 'ansible_facts' from source: unknown 11044 1726853239.51676: variable 'ansible_facts' from source: unknown 11044 1726853239.51704: _low_level_execute_command(): starting 11044 1726853239.51707: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11044 
1726853239.52194: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11044 1726853239.52200: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11044 1726853239.52217: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11044 1726853239.52262: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 11044 1726853239.52266: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11044 1726853239.52278: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11044 1726853239.52336: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11044 1726853239.54680: stdout chunk (state=3): >>>/root <<< 11044 1726853239.54825: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11044 1726853239.54860: stderr chunk (state=3): >>><<< 11044 1726853239.54863: stdout chunk (state=3): >>><<< 11044 1726853239.54885: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11044 1726853239.54896: _low_level_execute_command(): starting 11044 1726853239.54902: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853239.548847-11288-189511288847430 `" && echo ansible-tmp-1726853239.548847-11288-189511288847430="` echo /root/.ansible/tmp/ansible-tmp-1726853239.548847-11288-189511288847430 `" ) && sleep 0' 11044 1726853239.55355: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11044 1726853239.55358: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11044 1726853239.55360: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration 
data /etc/crypto-policies/back-ends/openssh.config <<< 11044 1726853239.55363: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11044 1726853239.55365: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11044 1726853239.55419: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 11044 1726853239.55425: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11044 1726853239.55469: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11044 1726853239.58215: stdout chunk (state=3): >>>ansible-tmp-1726853239.548847-11288-189511288847430=/root/.ansible/tmp/ansible-tmp-1726853239.548847-11288-189511288847430 <<< 11044 1726853239.58345: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11044 1726853239.58376: stderr chunk (state=3): >>><<< 11044 1726853239.58379: stdout chunk (state=3): >>><<< 11044 1726853239.58394: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853239.548847-11288-189511288847430=/root/.ansible/tmp/ansible-tmp-1726853239.548847-11288-189511288847430 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration 
data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11044 1726853239.58425: variable 'ansible_module_compression' from source: unknown 11044 1726853239.58474: ANSIBALLZ: Using generic lock for ansible.legacy.dnf 11044 1726853239.58478: ANSIBALLZ: Acquiring lock 11044 1726853239.58481: ANSIBALLZ: Lock acquired: 140360202229168 11044 1726853239.58483: ANSIBALLZ: Creating module 11044 1726853239.68767: ANSIBALLZ: Writing module into payload 11044 1726853239.68906: ANSIBALLZ: Writing module 11044 1726853239.68924: ANSIBALLZ: Renaming module 11044 1726853239.68929: ANSIBALLZ: Done creating module 11044 1726853239.68945: variable 'ansible_facts' from source: unknown 11044 1726853239.69008: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853239.548847-11288-189511288847430/AnsiballZ_dnf.py 11044 1726853239.69110: Sending initial data 11044 1726853239.69113: Sent initial data (151 bytes) 11044 1726853239.69579: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11044 1726853239.69583: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf 
debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found <<< 11044 1726853239.69585: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration <<< 11044 1726853239.69587: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11044 1726853239.69589: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11044 1726853239.69641: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 11044 1726853239.69644: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11044 1726853239.69647: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11044 1726853239.69705: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11044 1726853239.72014: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised 
server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 11044 1726853239.72056: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 11044 1726853239.72102: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-1104467doc9gy/tmp57j7t6qw /root/.ansible/tmp/ansible-tmp-1726853239.548847-11288-189511288847430/AnsiballZ_dnf.py <<< 11044 1726853239.72105: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853239.548847-11288-189511288847430/AnsiballZ_dnf.py" <<< 11044 1726853239.72141: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-1104467doc9gy/tmp57j7t6qw" to remote "/root/.ansible/tmp/ansible-tmp-1726853239.548847-11288-189511288847430/AnsiballZ_dnf.py" <<< 11044 1726853239.72148: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853239.548847-11288-189511288847430/AnsiballZ_dnf.py" <<< 11044 1726853239.72847: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11044 1726853239.72889: stderr chunk (state=3): >>><<< 11044 1726853239.72892: stdout chunk (state=3): >>><<< 11044 1726853239.72923: done transferring module to remote 11044 1726853239.72932: _low_level_execute_command(): starting 11044 1726853239.72940: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853239.548847-11288-189511288847430/ /root/.ansible/tmp/ansible-tmp-1726853239.548847-11288-189511288847430/AnsiballZ_dnf.py && sleep 0' 11044 1726853239.73400: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11044 1726853239.73403: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking 
match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11044 1726853239.73406: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration <<< 11044 1726853239.73408: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 <<< 11044 1726853239.73410: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11044 1726853239.73460: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 11044 1726853239.73463: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11044 1726853239.73466: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11044 1726853239.73511: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11044 1726853239.75999: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11044 1726853239.76021: stderr chunk (state=3): >>><<< 11044 1726853239.76024: stdout chunk (state=3): >>><<< 11044 1726853239.76037: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration 
requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11044 1726853239.76040: _low_level_execute_command(): starting 11044 1726853239.76045: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853239.548847-11288-189511288847430/AnsiballZ_dnf.py && sleep 0' 11044 1726853239.76487: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11044 1726853239.76491: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11044 1726853239.76493: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 11044 1726853239.76495: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.45.153 originally 10.31.45.153 debug2: match found <<< 11044 1726853239.76497: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11044 1726853239.76543: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 11044 1726853239.76553: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11044 1726853239.76599: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11044 1726853241.54410: stdout chunk (state=3): >>> {"msg": "", "changed": true, "results": ["Installed: dnsmasq-2.90-3.el10.x86_64"], "rc": 0, "invocation": {"module_args": {"name": ["dnsmasq"], "state": "present", "allow_downgrade": false, "allowerasing": false, "autoremove": false, "bugfix": false, "cacheonly": false, "disable_gpg_check": false, "disable_plugin": [], "disablerepo": [], "download_only": false, "enable_plugin": [], "enablerepo": [], "exclude": [], "installroot": "/", "install_repoquery": true, "install_weak_deps": true, "security": false, "skip_broken": false, "update_cache": false, "update_only": false, "validate_certs": true, "sslverify": true, "lock_timeout": 30, "use_backend": "auto", "best": null, "conf_file": null, "disable_excludes": null, "download_dir": null, "list": null, "nobest": null, "releasever": null}}} <<< 11044 1726853241.59552: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.45.153 closed. 
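[Editor's note] The JSON blob the dnf module just printed to stdout is the raw module result that the controller parses before rendering the task outcome. A minimal, illustrative sketch of that parsing step (the field names `changed`, `rc`, and `results` are taken from the log above; the helper itself is made up for illustration, not Ansible's actual code):

```python
import json

# Abbreviated copy of the module result seen in the log above.
raw = ('{"msg": "", "changed": true, '
       '"results": ["Installed: dnsmasq-2.90-3.el10.x86_64"], "rc": 0}')

def summarize_module_result(stdout_text):
    """Illustrative only: decode a module's stdout JSON and classify it
    the way the callback output below does (changed vs. ok vs. failed)."""
    result = json.loads(stdout_text)
    if result.get("rc", 0) != 0 or result.get("failed"):
        return "failed", result
    return ("changed" if result.get("changed") else "ok"), result

status, result = summarize_module_result(raw)
# status == "changed"; result["results"][0] names the installed package
```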
<<< 11044 1726853241.59556: stdout chunk (state=3): >>><<< 11044 1726853241.59558: stderr chunk (state=3): >>><<< 11044 1726853241.59764: _low_level_execute_command() done: rc=0, stdout= {"msg": "", "changed": true, "results": ["Installed: dnsmasq-2.90-3.el10.x86_64"], "rc": 0, "invocation": {"module_args": {"name": ["dnsmasq"], "state": "present", "allow_downgrade": false, "allowerasing": false, "autoremove": false, "bugfix": false, "cacheonly": false, "disable_gpg_check": false, "disable_plugin": [], "disablerepo": [], "download_only": false, "enable_plugin": [], "enablerepo": [], "exclude": [], "installroot": "/", "install_repoquery": true, "install_weak_deps": true, "security": false, "skip_broken": false, "update_cache": false, "update_only": false, "validate_certs": true, "sslverify": true, "lock_timeout": 30, "use_backend": "auto", "best": null, "conf_file": null, "disable_excludes": null, "download_dir": null, "list": null, "nobest": null, "releasever": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 
setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.45.153 closed. 11044 1726853241.59776: done with _execute_module (ansible.legacy.dnf, {'name': 'dnsmasq', 'state': 'present', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.dnf', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853239.548847-11288-189511288847430/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11044 1726853241.59779: _low_level_execute_command(): starting 11044 1726853241.59781: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853239.548847-11288-189511288847430/ > /dev/null 2>&1 && sleep 0' 11044 1726853241.60840: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11044 1726853241.60874: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11044 1726853241.60988: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration 
data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11044 1726853241.61021: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 11044 1726853241.61039: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11044 1726853241.61066: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11044 1726853241.61145: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11044 1726853241.63198: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11044 1726853241.63201: stdout chunk (state=3): >>><<< 11044 1726853241.63208: stderr chunk (state=3): >>><<< 11044 1726853241.63214: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying 
existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11044 1726853241.63216: handler run complete 11044 1726853241.63980: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 11044 1726853241.63984: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 11044 1726853241.64319: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 11044 1726853241.64361: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 11044 1726853241.64397: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 11044 1726853241.64495: variable '__install_status' from source: unknown 11044 1726853241.64660: Evaluated conditional (__install_status is success): True 11044 1726853241.64683: attempt loop complete, returning result 11044 1726853241.64690: _execute() done 11044 1726853241.64696: dumping result to json 11044 1726853241.64705: done dumping result, returning 11044 1726853241.64756: done running TaskExecutor() for managed_node1/TASK: Install dnsmasq [02083763-bbaf-c5a6-f857-00000000000f] 11044 1726853241.64764: sending task result for task 02083763-bbaf-c5a6-f857-00000000000f changed: [managed_node1] => { "attempts": 1, "changed": true, "rc": 0, "results": [ "Installed: dnsmasq-2.90-3.el10.x86_64" ] } 11044 1726853241.65047: no more pending results, returning what we have 11044 1726853241.65050: results queue empty 11044 1726853241.65051: checking for any_errors_fatal 11044 1726853241.65058: done checking for any_errors_fatal 11044 1726853241.65059: checking for max_fail_percentage 11044 1726853241.65061: done checking for max_fail_percentage 11044 1726853241.65062: 
checking to see if all hosts have failed and the running result is not ok 11044 1726853241.65063: done checking to see if all hosts have failed 11044 1726853241.65064: getting the remaining hosts for this loop 11044 1726853241.65065: done getting the remaining hosts for this loop 11044 1726853241.65068: getting the next task for host managed_node1 11044 1726853241.65076: done getting next task for host managed_node1 11044 1726853241.65078: ^ task is: TASK: Install pgrep, sysctl 11044 1726853241.65081: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11044 1726853241.65084: getting variables 11044 1726853241.65086: in VariableManager get_vars() 11044 1726853241.65126: Calling all_inventory to load vars for managed_node1 11044 1726853241.65129: Calling groups_inventory to load vars for managed_node1 11044 1726853241.65131: Calling all_plugins_inventory to load vars for managed_node1 11044 1726853241.65143: Calling all_plugins_play to load vars for managed_node1 11044 1726853241.65146: Calling groups_plugins_inventory to load vars for managed_node1 11044 1726853241.65149: Calling groups_plugins_play to load vars for managed_node1 11044 1726853241.65730: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11044 1726853241.66385: done sending task result for task 02083763-bbaf-c5a6-f857-00000000000f 11044 1726853241.66388: WORKER PROCESS EXITING 11044 1726853241.66413: done with get_vars() 11044 1726853241.66425: done getting variables 11044 1726853241.66486: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Install pgrep, sysctl] *************************************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_test_interfaces_with_dhcp.yml:17 Friday 20 September 2024 13:27:21 -0400 (0:00:02.180) 0:00:06.042 ****** 11044 1726853241.66633: entering _queue_task() for managed_node1/package 11044 1726853241.67147: worker is 1 (out of 1 available) 11044 1726853241.67275: exiting _queue_task() for managed_node1/package 11044 1726853241.67288: done queuing things up, now waiting for results queue to drain 11044 1726853241.67290: waiting for pending results... 
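[Editor's note] The "Evaluated conditional (__install_status is success): True", the "attempts": 1 field, and "attempt loop complete" messages above come from the install task's retry loop: the registered result is re-tested until it succeeds or retries run out. A hedged sketch of that loop in plain Python (the function and parameter names here are invented for illustration):

```python
import time

def attempt_loop(run_task, until, retries=3, delay=1):
    """Illustrative retry loop: re-run a task until the `until` test
    passes or attempts are exhausted, recording the attempt count the
    way the result above reports "attempts": 1."""
    for attempt in range(1, retries + 1):
        result = run_task()
        result["attempts"] = attempt
        if until(result):              # e.g. `__install_status is success`
            return result              # conditional True -> loop completes
        time.sleep(delay)
    result["failed"] = True            # retries exhausted
    return result

# A task that succeeds on its first run finishes with attempts == 1,
# matching the "attempts": 1 in the task result above.
result = attempt_loop(lambda: {"rc": 0, "changed": True},
                      until=lambda r: r.get("rc") == 0, delay=0)
```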
11044 1726853241.67600: running TaskExecutor() for managed_node1/TASK: Install pgrep, sysctl 11044 1726853241.67698: in run() - task 02083763-bbaf-c5a6-f857-000000000010 11044 1726853241.67750: variable 'ansible_search_path' from source: unknown 11044 1726853241.67759: variable 'ansible_search_path' from source: unknown 11044 1726853241.67800: calling self._execute() 11044 1726853241.67890: variable 'ansible_host' from source: host vars for 'managed_node1' 11044 1726853241.67901: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11044 1726853241.67915: variable 'omit' from source: magic vars 11044 1726853241.68359: variable 'ansible_distribution_major_version' from source: facts 11044 1726853241.68362: Evaluated conditional (ansible_distribution_major_version != '6'): True 11044 1726853241.68423: variable 'ansible_os_family' from source: facts 11044 1726853241.68434: Evaluated conditional (ansible_os_family == 'RedHat'): True 11044 1726853241.68620: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 11044 1726853241.68964: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 11044 1726853241.69015: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 11044 1726853241.69057: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 11044 1726853241.69097: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 11044 1726853241.69181: variable 'ansible_distribution_major_version' from source: facts 11044 1726853241.69197: Evaluated conditional (ansible_distribution_major_version is version('6', '<=')): False 11044 1726853241.69204: when evaluation is False, skipping this task 11044 1726853241.69211: _execute() done 11044 1726853241.69218: dumping result to json 11044 1726853241.69241: done dumping result, 
returning 11044 1726853241.69244: done running TaskExecutor() for managed_node1/TASK: Install pgrep, sysctl [02083763-bbaf-c5a6-f857-000000000010] 11044 1726853241.69338: sending task result for task 02083763-bbaf-c5a6-f857-000000000010 11044 1726853241.69409: done sending task result for task 02083763-bbaf-c5a6-f857-000000000010 11044 1726853241.69412: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version is version('6', '<=')", "skip_reason": "Conditional result was False" } 11044 1726853241.69462: no more pending results, returning what we have 11044 1726853241.69466: results queue empty 11044 1726853241.69467: checking for any_errors_fatal 11044 1726853241.69556: done checking for any_errors_fatal 11044 1726853241.69558: checking for max_fail_percentage 11044 1726853241.69559: done checking for max_fail_percentage 11044 1726853241.69560: checking to see if all hosts have failed and the running result is not ok 11044 1726853241.69561: done checking to see if all hosts have failed 11044 1726853241.69562: getting the remaining hosts for this loop 11044 1726853241.69563: done getting the remaining hosts for this loop 11044 1726853241.69567: getting the next task for host managed_node1 11044 1726853241.69574: done getting next task for host managed_node1 11044 1726853241.69576: ^ task is: TASK: Install pgrep, sysctl 11044 1726853241.69579: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False 11044 1726853241.69582: getting variables 11044 1726853241.69584: in VariableManager get_vars() 11044 1726853241.69623: Calling all_inventory to load vars for managed_node1 11044 1726853241.69626: Calling groups_inventory to load vars for managed_node1 11044 1726853241.69629: Calling all_plugins_inventory to load vars for managed_node1 11044 1726853241.69638: Calling all_plugins_play to load vars for managed_node1 11044 1726853241.69641: Calling groups_plugins_inventory to load vars for managed_node1 11044 1726853241.69644: Calling groups_plugins_play to load vars for managed_node1 11044 1726853241.69964: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11044 1726853241.70173: done with get_vars() 11044 1726853241.70182: done getting variables 11044 1726853241.70241: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Install pgrep, sysctl] *************************************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_test_interfaces_with_dhcp.yml:26 Friday 20 September 2024 13:27:21 -0400 (0:00:00.036) 0:00:06.078 ****** 11044 1726853241.70268: entering _queue_task() for managed_node1/package 11044 1726853241.70513: worker is 1 (out of 1 available) 11044 1726853241.70638: exiting _queue_task() for managed_node1/package 11044 1726853241.70649: done queuing things up, now waiting for results queue to drain 11044 1726853241.70650: waiting for pending results... 
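[Editor's note] Both "Install pgrep, sysctl" tasks gate on Jinja's `version` test: `ansible_distribution_major_version is version('6', '<=')` evaluated False on this EL10 host (hence the skip above), while `version('7', '>=')` evaluates True for the second task. A rough sketch of that comparison (real Ansible compares full dotted version strings; a bare major-version integer compare is enough to reproduce these two decisions):

```python
import operator

_OPS = {"<": operator.lt, "<=": operator.le, ">": operator.gt,
        ">=": operator.ge, "==": operator.eq, "!=": operator.ne}

def version_test(value, reference, op="=="):
    """Illustrative stand-in for the Jinja `version` test seen in the
    log. Only handles bare major versions; Ansible's real test also
    parses dotted versions such as '8.4'."""
    return _OPS[op](int(value), int(reference))

# The two decisions from the log, on an EL10 (major version 10) host:
skip_old = version_test("10", "6", "<=")   # False -> task skipped
run_new = version_test("10", "7", ">=")    # True  -> task runs
```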
11044 1726853241.70865: running TaskExecutor() for managed_node1/TASK: Install pgrep, sysctl 11044 1726853241.70896: in run() - task 02083763-bbaf-c5a6-f857-000000000011 11044 1726853241.70913: variable 'ansible_search_path' from source: unknown 11044 1726853241.70920: variable 'ansible_search_path' from source: unknown 11044 1726853241.70956: calling self._execute() 11044 1726853241.71041: variable 'ansible_host' from source: host vars for 'managed_node1' 11044 1726853241.71074: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11044 1726853241.71077: variable 'omit' from source: magic vars 11044 1726853241.71428: variable 'ansible_distribution_major_version' from source: facts 11044 1726853241.71444: Evaluated conditional (ansible_distribution_major_version != '6'): True 11044 1726853241.71612: variable 'ansible_os_family' from source: facts 11044 1726853241.71616: Evaluated conditional (ansible_os_family == 'RedHat'): True 11044 1726853241.71747: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 11044 1726853241.72015: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 11044 1726853241.72077: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 11044 1726853241.72116: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 11044 1726853241.72155: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 11044 1726853241.72235: variable 'ansible_distribution_major_version' from source: facts 11044 1726853241.72251: Evaluated conditional (ansible_distribution_major_version is version('7', '>=')): True 11044 1726853241.72277: variable 'omit' from source: magic vars 11044 1726853241.72317: variable 'omit' from source: magic vars 11044 1726853241.72478: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 11044 1726853241.74534: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 11044 1726853241.74553: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 11044 1726853241.74597: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 11044 1726853241.74633: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 11044 1726853241.74667: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 11044 1726853241.74763: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11044 1726853241.74798: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11044 1726853241.74829: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11044 1726853241.74968: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11044 1726853241.74973: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11044 1726853241.74994: variable '__network_is_ostree' from source: set_fact 11044 1726853241.75003: 
variable 'omit' from source: magic vars 11044 1726853241.75035: variable 'omit' from source: magic vars 11044 1726853241.75063: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11044 1726853241.75129: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11044 1726853241.75151: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11044 1726853241.75174: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11044 1726853241.75192: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11044 1726853241.75227: variable 'inventory_hostname' from source: host vars for 'managed_node1' 11044 1726853241.75235: variable 'ansible_host' from source: host vars for 'managed_node1' 11044 1726853241.75241: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11044 1726853241.75345: Set connection var ansible_timeout to 10 11044 1726853241.75402: Set connection var ansible_shell_executable to /bin/sh 11044 1726853241.75405: Set connection var ansible_shell_type to sh 11044 1726853241.75407: Set connection var ansible_module_compression to ZIP_DEFLATED 11044 1726853241.75409: Set connection var ansible_connection to ssh 11044 1726853241.75411: Set connection var ansible_pipelining to False 11044 1726853241.75429: variable 'ansible_shell_executable' from source: unknown 11044 1726853241.75436: variable 'ansible_connection' from source: unknown 11044 1726853241.75442: variable 'ansible_module_compression' from source: unknown 11044 1726853241.75448: variable 'ansible_shell_type' from source: unknown 11044 1726853241.75454: variable 'ansible_shell_executable' from source: unknown 11044 1726853241.75459: variable 'ansible_host' from source: host vars for 'managed_node1' 
11044 1726853241.75510: variable 'ansible_pipelining' from source: unknown 11044 1726853241.75513: variable 'ansible_timeout' from source: unknown 11044 1726853241.75515: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11044 1726853241.75583: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11044 1726853241.75596: variable 'omit' from source: magic vars 11044 1726853241.75605: starting attempt loop 11044 1726853241.75611: running the handler 11044 1726853241.75627: variable 'ansible_facts' from source: unknown 11044 1726853241.75637: variable 'ansible_facts' from source: unknown 11044 1726853241.75673: _low_level_execute_command(): starting 11044 1726853241.75727: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11044 1726853241.76388: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11044 1726853241.76405: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11044 1726853241.76491: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 
10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11044 1726853241.76533: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 11044 1726853241.76548: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11044 1726853241.76570: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11044 1726853241.76652: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11044 1726853241.78330: stdout chunk (state=3): >>>/root <<< 11044 1726853241.78495: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11044 1726853241.78499: stdout chunk (state=3): >>><<< 11044 1726853241.78501: stderr chunk (state=3): >>><<< 11044 1726853241.78525: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: 
mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11044 1726853241.78543: _low_level_execute_command(): starting 11044 1726853241.78629: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853241.785324-11430-22901890367471 `" && echo ansible-tmp-1726853241.785324-11430-22901890367471="` echo /root/.ansible/tmp/ansible-tmp-1726853241.785324-11430-22901890367471 `" ) && sleep 0' 11044 1726853241.79198: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11044 1726853241.79213: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11044 1726853241.79226: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11044 1726853241.79317: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11044 1726853241.79346: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 11044 1726853241.79363: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11044 1726853241.79387: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11044 
1726853241.79451: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11044 1726853241.81337: stdout chunk (state=3): >>>ansible-tmp-1726853241.785324-11430-22901890367471=/root/.ansible/tmp/ansible-tmp-1726853241.785324-11430-22901890367471 <<< 11044 1726853241.81493: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11044 1726853241.81502: stdout chunk (state=3): >>><<< 11044 1726853241.81520: stderr chunk (state=3): >>><<< 11044 1726853241.81676: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853241.785324-11430-22901890367471=/root/.ansible/tmp/ansible-tmp-1726853241.785324-11430-22901890367471 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11044 1726853241.81680: variable 'ansible_module_compression' from source: unknown 11044 1726853241.81683: ANSIBALLZ: using cached 
module: /root/.ansible/tmp/ansible-local-1104467doc9gy/ansiballz_cache/ansible.modules.dnf-ZIP_DEFLATED 11044 1726853241.81693: variable 'ansible_facts' from source: unknown 11044 1726853241.81834: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853241.785324-11430-22901890367471/AnsiballZ_dnf.py 11044 1726853241.82040: Sending initial data 11044 1726853241.82043: Sent initial data (150 bytes) 11044 1726853241.82773: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11044 1726853241.82777: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 11044 1726853241.82802: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11044 1726853241.82817: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11044 1726853241.82883: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11044 1726853241.84426: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension 
"statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 11044 1726853241.84524: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 11044 1726853241.84557: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-1104467doc9gy/tmpv076p5e1 /root/.ansible/tmp/ansible-tmp-1726853241.785324-11430-22901890367471/AnsiballZ_dnf.py <<< 11044 1726853241.84579: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853241.785324-11430-22901890367471/AnsiballZ_dnf.py" <<< 11044 1726853241.84703: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-1104467doc9gy/tmpv076p5e1" to remote "/root/.ansible/tmp/ansible-tmp-1726853241.785324-11430-22901890367471/AnsiballZ_dnf.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853241.785324-11430-22901890367471/AnsiballZ_dnf.py" <<< 11044 1726853241.86055: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11044 1726853241.86124: stderr chunk (state=3): >>><<< 11044 1726853241.86136: stdout chunk (state=3): >>><<< 11044 1726853241.86168: done transferring module to remote 11044 1726853241.86186: _low_level_execute_command(): starting 11044 1726853241.86195: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x 
/root/.ansible/tmp/ansible-tmp-1726853241.785324-11430-22901890367471/ /root/.ansible/tmp/ansible-tmp-1726853241.785324-11430-22901890367471/AnsiballZ_dnf.py && sleep 0' 11044 1726853241.86814: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11044 1726853241.86833: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11044 1726853241.86848: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11044 1726853241.86884: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 11044 1726853241.86901: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11044 1726853241.86941: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11044 1726853241.87007: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 11044 1726853241.87057: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11044 1726853241.87076: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11044 1726853241.87103: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11044 1726853241.88900: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11044 1726853241.88911: stdout chunk (state=3): 
>>><<< 11044 1726853241.89015: stderr chunk (state=3): >>><<< 11044 1726853241.89019: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11044 1726853241.89023: _low_level_execute_command(): starting 11044 1726853241.89026: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853241.785324-11430-22901890367471/AnsiballZ_dnf.py && sleep 0' 11044 1726853241.89575: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11044 1726853241.89591: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11044 1726853241.89611: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11044 1726853241.89628: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf <<< 11044 1726853241.89645: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 <<< 11044 1726853241.89691: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11044 1726853241.89759: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 11044 1726853241.89779: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11044 1726853241.89805: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11044 1726853241.89881: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11044 1726853242.30564: stdout chunk (state=3): >>> {"msg": "Nothing to do", "changed": false, "results": [], "rc": 0, "invocation": {"module_args": {"name": ["procps-ng"], "state": "present", "allow_downgrade": false, "allowerasing": false, "autoremove": false, "bugfix": false, "cacheonly": false, "disable_gpg_check": false, "disable_plugin": [], "disablerepo": [], "download_only": false, "enable_plugin": [], "enablerepo": [], "exclude": [], "installroot": "/", "install_repoquery": true, "install_weak_deps": true, "security": false, "skip_broken": false, "update_cache": false, "update_only": false, "validate_certs": true, "sslverify": true, "lock_timeout": 30, "use_backend": "auto", "best": null, "conf_file": 
null, "disable_excludes": null, "download_dir": null, "list": null, "nobest": null, "releasever": null}}} <<< 11044 1726853242.34576: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.45.153 closed. <<< 11044 1726853242.34601: stdout chunk (state=3): >>><<< 11044 1726853242.34629: stderr chunk (state=3): >>><<< 11044 1726853242.34649: _low_level_execute_command() done: rc=0, stdout= {"msg": "Nothing to do", "changed": false, "results": [], "rc": 0, "invocation": {"module_args": {"name": ["procps-ng"], "state": "present", "allow_downgrade": false, "allowerasing": false, "autoremove": false, "bugfix": false, "cacheonly": false, "disable_gpg_check": false, "disable_plugin": [], "disablerepo": [], "download_only": false, "enable_plugin": [], "enablerepo": [], "exclude": [], "installroot": "/", "install_repoquery": true, "install_weak_deps": true, "security": false, "skip_broken": false, "update_cache": false, "update_only": false, "validate_certs": true, "sslverify": true, "lock_timeout": 30, "use_backend": "auto", "best": null, "conf_file": null, "disable_excludes": null, "download_dir": null, "list": null, "nobest": null, "releasever": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.45.153 closed. 11044 1726853242.34702: done with _execute_module (ansible.legacy.dnf, {'name': 'procps-ng', 'state': 'present', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.dnf', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853241.785324-11430-22901890367471/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11044 1726853242.34708: _low_level_execute_command(): starting 11044 1726853242.34713: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853241.785324-11430-22901890367471/ > /dev/null 2>&1 && sleep 0' 11044 1726853242.35146: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11044 1726853242.35150: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 
11044 1726853242.35152: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11044 1726853242.35154: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11044 1726853242.35199: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 11044 1726853242.35202: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11044 1726853242.35250: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11044 1726853242.37109: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11044 1726853242.37148: stderr chunk (state=3): >>><<< 11044 1726853242.37152: stdout chunk (state=3): >>><<< 11044 1726853242.37377: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11044 1726853242.37380: handler run complete 11044 1726853242.37383: attempt loop complete, returning result 11044 1726853242.37385: _execute() done 11044 1726853242.37388: dumping result to json 11044 1726853242.37390: done dumping result, returning 11044 1726853242.37392: done running TaskExecutor() for managed_node1/TASK: Install pgrep, sysctl [02083763-bbaf-c5a6-f857-000000000011] 11044 1726853242.37394: sending task result for task 02083763-bbaf-c5a6-f857-000000000011 11044 1726853242.37481: done sending task result for task 02083763-bbaf-c5a6-f857-000000000011 ok: [managed_node1] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do 11044 1726853242.37582: no more pending results, returning what we have 11044 1726853242.37585: results queue empty 11044 1726853242.37586: checking for any_errors_fatal 11044 1726853242.37591: done checking for any_errors_fatal 11044 1726853242.37592: checking for max_fail_percentage 11044 1726853242.37593: done checking for max_fail_percentage 11044 1726853242.37594: checking to see if all hosts have failed and the running result is not ok 11044 1726853242.37595: done checking to see if all hosts have failed 11044 1726853242.37595: getting the remaining hosts for this loop 11044 1726853242.37596: done getting the remaining hosts for this loop 11044 1726853242.37601: getting the next task for host managed_node1 11044 1726853242.37606: done getting next task for host managed_node1 11044 1726853242.37608: ^ task is: TASK: Create test interfaces 11044 
1726853242.37611: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11044 1726853242.37615: getting variables 11044 1726853242.37616: in VariableManager get_vars() 11044 1726853242.37653: Calling all_inventory to load vars for managed_node1 11044 1726853242.37656: Calling groups_inventory to load vars for managed_node1 11044 1726853242.37658: Calling all_plugins_inventory to load vars for managed_node1 11044 1726853242.37664: WORKER PROCESS EXITING 11044 1726853242.37697: Calling all_plugins_play to load vars for managed_node1 11044 1726853242.37700: Calling groups_plugins_inventory to load vars for managed_node1 11044 1726853242.37702: Calling groups_plugins_play to load vars for managed_node1 11044 1726853242.37813: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11044 1726853242.37931: done with get_vars() 11044 1726853242.37939: done getting variables 11044 1726853242.38010: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [Create test interfaces] ************************************************** task path: 
/tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_test_interfaces_with_dhcp.yml:35 Friday 20 September 2024 13:27:22 -0400 (0:00:00.677) 0:00:06.756 ****** 11044 1726853242.38033: entering _queue_task() for managed_node1/shell 11044 1726853242.38034: Creating lock for shell 11044 1726853242.38245: worker is 1 (out of 1 available) 11044 1726853242.38256: exiting _queue_task() for managed_node1/shell 11044 1726853242.38268: done queuing things up, now waiting for results queue to drain 11044 1726853242.38269: waiting for pending results... 11044 1726853242.38423: running TaskExecutor() for managed_node1/TASK: Create test interfaces 11044 1726853242.38492: in run() - task 02083763-bbaf-c5a6-f857-000000000012 11044 1726853242.38506: variable 'ansible_search_path' from source: unknown 11044 1726853242.38510: variable 'ansible_search_path' from source: unknown 11044 1726853242.38535: calling self._execute() 11044 1726853242.38598: variable 'ansible_host' from source: host vars for 'managed_node1' 11044 1726853242.38602: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11044 1726853242.38611: variable 'omit' from source: magic vars 11044 1726853242.38938: variable 'ansible_distribution_major_version' from source: facts 11044 1726853242.38957: Evaluated conditional (ansible_distribution_major_version != '6'): True 11044 1726853242.38962: variable 'omit' from source: magic vars 11044 1726853242.38997: variable 'omit' from source: magic vars 11044 1726853242.39245: variable 'dhcp_interface1' from source: play vars 11044 1726853242.39253: variable 'dhcp_interface2' from source: play vars 11044 1726853242.39375: variable 'omit' from source: magic vars 11044 1726853242.39379: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11044 1726853242.39381: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, 
class_only=False) 11044 1726853242.39400: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11044 1726853242.39423: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11044 1726853242.39439: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11044 1726853242.39478: variable 'inventory_hostname' from source: host vars for 'managed_node1' 11044 1726853242.39486: variable 'ansible_host' from source: host vars for 'managed_node1' 11044 1726853242.39495: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11044 1726853242.39596: Set connection var ansible_timeout to 10 11044 1726853242.39611: Set connection var ansible_shell_executable to /bin/sh 11044 1726853242.39618: Set connection var ansible_shell_type to sh 11044 1726853242.39627: Set connection var ansible_module_compression to ZIP_DEFLATED 11044 1726853242.39636: Set connection var ansible_connection to ssh 11044 1726853242.39648: Set connection var ansible_pipelining to False 11044 1726853242.39678: variable 'ansible_shell_executable' from source: unknown 11044 1726853242.39686: variable 'ansible_connection' from source: unknown 11044 1726853242.39694: variable 'ansible_module_compression' from source: unknown 11044 1726853242.39701: variable 'ansible_shell_type' from source: unknown 11044 1726853242.39706: variable 'ansible_shell_executable' from source: unknown 11044 1726853242.39776: variable 'ansible_host' from source: host vars for 'managed_node1' 11044 1726853242.39779: variable 'ansible_pipelining' from source: unknown 11044 1726853242.39781: variable 'ansible_timeout' from source: unknown 11044 1726853242.39784: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11044 1726853242.39879: Loading ActionModule 'shell' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11044 1726853242.39894: variable 'omit' from source: magic vars 11044 1726853242.39904: starting attempt loop 11044 1726853242.39913: running the handler 11044 1726853242.39926: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11044 1726853242.39955: _low_level_execute_command(): starting 11044 1726853242.40021: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11044 1726853242.41173: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11044 1726853242.41191: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11044 1726853242.41207: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11044 1726853242.41226: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11044 1726853242.41328: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 <<< 11044 1726853242.41446: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11044 1726853242.41612: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 11044 1726853242.41698: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11044 1726853242.41713: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11044 1726853242.41846: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11044 1726853242.43464: stdout chunk (state=3): >>>/root <<< 11044 1726853242.43685: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11044 1726853242.43741: stderr chunk (state=3): >>><<< 11044 1726853242.43755: stdout chunk (state=3): >>><<< 11044 1726853242.43891: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: 
mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11044 1726853242.43895: _low_level_execute_command(): starting 11044 1726853242.43899: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853242.437902-11467-199656231104479 `" && echo ansible-tmp-1726853242.437902-11467-199656231104479="` echo /root/.ansible/tmp/ansible-tmp-1726853242.437902-11467-199656231104479 `" ) && sleep 0' 11044 1726853242.44485: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11044 1726853242.44499: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11044 1726853242.44514: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11044 1726853242.44624: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK <<< 11044 1726853242.44648: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11044 1726853242.44717: stderr chunk (state=3): >>>debug1: 
mux_client_request_session: master session id: 2 <<< 11044 1726853242.46586: stdout chunk (state=3): >>>ansible-tmp-1726853242.437902-11467-199656231104479=/root/.ansible/tmp/ansible-tmp-1726853242.437902-11467-199656231104479 <<< 11044 1726853242.46787: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11044 1726853242.46790: stdout chunk (state=3): >>><<< 11044 1726853242.46792: stderr chunk (state=3): >>><<< 11044 1726853242.46809: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853242.437902-11467-199656231104479=/root/.ansible/tmp/ansible-tmp-1726853242.437902-11467-199656231104479 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11044 1726853242.46985: variable 'ansible_module_compression' from source: unknown 11044 1726853242.46988: ANSIBALLZ: Using generic lock for ansible.legacy.command 11044 
1726853242.46990: ANSIBALLZ: Acquiring lock 11044 1726853242.46992: ANSIBALLZ: Lock acquired: 140360202229168 11044 1726853242.46994: ANSIBALLZ: Creating module 11044 1726853242.60310: ANSIBALLZ: Writing module into payload 11044 1726853242.60393: ANSIBALLZ: Writing module 11044 1726853242.60410: ANSIBALLZ: Renaming module 11044 1726853242.60424: ANSIBALLZ: Done creating module 11044 1726853242.60449: variable 'ansible_facts' from source: unknown 11044 1726853242.60518: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853242.437902-11467-199656231104479/AnsiballZ_command.py 11044 1726853242.60609: Sending initial data 11044 1726853242.60612: Sent initial data (155 bytes) 11044 1726853242.61310: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11044 1726853242.61320: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 11044 1726853242.61333: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11044 1726853242.61386: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11044 
1726853242.61430: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11044 1726853242.63060: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 11044 1726853242.63103: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 11044 1726853242.63138: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-1104467doc9gy/tmp_yh9qm0f /root/.ansible/tmp/ansible-tmp-1726853242.437902-11467-199656231104479/AnsiballZ_command.py <<< 11044 1726853242.63141: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853242.437902-11467-199656231104479/AnsiballZ_command.py" <<< 11044 1726853242.63179: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-1104467doc9gy/tmp_yh9qm0f" to remote "/root/.ansible/tmp/ansible-tmp-1726853242.437902-11467-199656231104479/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853242.437902-11467-199656231104479/AnsiballZ_command.py" <<< 11044 1726853242.63804: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11044 1726853242.63833: stderr chunk (state=3): >>><<< 11044 1726853242.63837: stdout chunk (state=3): >>><<< 11044 1726853242.63927: done transferring module to remote 11044 1726853242.63942: _low_level_execute_command(): starting 11044 1726853242.63950: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853242.437902-11467-199656231104479/ /root/.ansible/tmp/ansible-tmp-1726853242.437902-11467-199656231104479/AnsiballZ_command.py && sleep 0' 11044 1726853242.64595: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11044 1726853242.64599: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11044 1726853242.64612: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11044 1726853242.64629: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11044 1726853242.64641: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 <<< 11044 
1726853242.64649: stderr chunk (state=3): >>>debug2: match not found <<< 11044 1726853242.64657: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11044 1726853242.64674: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 11044 1726853242.64759: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 11044 1726853242.64877: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11044 1726853242.64881: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11044 1726853242.66660: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11044 1726853242.66663: stdout chunk (state=3): >>><<< 11044 1726853242.66665: stderr chunk (state=3): >>><<< 11044 1726853242.66679: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: 
Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11044 1726853242.66682: _low_level_execute_command(): starting 11044 1726853242.66688: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853242.437902-11467-199656231104479/AnsiballZ_command.py && sleep 0' 11044 1726853242.67175: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11044 1726853242.67178: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11044 1726853242.67181: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11044 1726853242.67183: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 11044 1726853242.67240: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK <<< 11044 1726853242.67248: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11044 1726853242.67301: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11044 1726853244.04555: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "+ exec\n+ ip link add test1 type veth peer name test1p\n+ ip link add test2 type veth peer name test2p\n++ pgrep NetworkManager\n+ '[' -n 702 ']'\n+ nmcli d set test1 managed true\n+ nmcli d set test2 managed true\n+ nmcli d set test1p managed false\n+ nmcli d set test2p managed false\n+ ip link set test1p up\n+ ip link set test2p up\n+ ip link add name testbr type bridge forward_delay 0\n++ pgrep NetworkManager\n+ '[' -n 702 ']'\n+ nmcli d set testbr managed false\n+ ip link set testbr up\n+ timer=0\n+ ip addr show testbr\n+ grep -q 'inet [1-9]'\n+ let timer+=1\n+ '[' 1 -eq 30 ']'\n+ sleep 1\n+ rc=0\n+ ip addr add 192.0.2.1/24 dev testbr\n+ '[' 0 '!=' 0 ']'\n+ ip -6 addr add 2001:DB8::1/32 dev testbr\n+ '[' 0 '!=' 0 ']'\n+ ip addr show testbr\n+ grep -q 'inet [1-9]'\n+ grep 'release 6' /etc/redhat-release\n+ ip link set test1p master testbr\n+ ip link set test2p master testbr\n+ systemctl is-active firewalld\ninactive\n+ dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --dhcp-range=2001:DB8::10,2001:DB8::1FF,slaac,64,240 --enable-ra --interface=testbr --bind-interfaces", "rc": 0, "cmd": "set -euxo pipefail\nexec 1>&2\nip link add test1 type veth peer name test1p\nip link add test2 type veth peer name test2p\nif [ -n \"$(pgrep NetworkManager)\" ];then\n nmcli d set test1 managed true\n nmcli d set test2 managed true\n # NetworkManager should not manage DHCP server ports\n nmcli d 
set test1p managed false\n nmcli d set test2p managed false\nfi\nip link set test1p up\nip link set test2p up\n\n# Create the 'testbr' - providing both 10.x ipv4 and 2620:52:0 ipv6 dhcp\nip link add name testbr type bridge forward_delay 0\nif [ -n \"$(pgrep NetworkManager)\" ];then\n # NetworkManager should not manage DHCP server ports\n nmcli d set testbr managed false\nfi\nip link set testbr up\ntimer=0\n# The while loop following is a workaround for the NM bug, which can be\n# tracked in https://bugzilla.redhat.com/<<< 11044 1726853244.04561: stdout chunk (state=3): >>>show_bug.cgi?id=2079642\nwhile ! ip addr show testbr | grep -q 'inet [1-9]'\ndo\n let \"timer+=1\"\n if [ $timer -eq 30 ]; then\n echo ERROR - could not add testbr\n ip addr\n exit 1\n fi\n sleep 1\n rc=0\n ip addr add 192.0.2.1/24 dev testbr || rc=\"$?\"\n if [ \"$rc\" != 0 ]; then\n echo NOTICE - could not add testbr - error code \"$rc\"\n continue\n fi\n ip -6 addr add 2001:DB8::1/32 dev testbr || rc=\"$?\"\n if [ \"$rc\" != 0 ]; then\n echo NOTICE - could not add testbr - error code \"$rc\"\n continue\n fi\ndone\n\nif grep 'release 6' /etc/redhat-release; then\n # We need bridge-utils and radvd only in rhel6\n if ! rpm -q --quiet radvd; then yum -y install radvd; fi\n if ! 
rpm -q --quiet bridge-utils; then yum -y install bridge-utils; fi\n\n # We need to add iptables rule to allow dhcp request\n iptables -I INPUT -i testbr -p udp --dport 67:68 --sport 67:68 -j ACCEPT\n\n # Add test1, test2 peers into the testbr\n brctl addif testbr test1p\n brctl addif testbr test2p\n\n # in RHEL6 /run is not present\n mkdir -p /run\n\n # and dnsmasq does not support ipv6\n dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --interface=testbr --bind-interfaces\n\n # start radvd for ipv6\n echo 'interface testbr {' > /etc/radvd.conf\n echo ' AdvSendAdvert on;' >> /etc/radvd.conf\n echo ' prefix 2001:DB8::/64 { ' >> /etc/radvd.conf\n echo ' AdvOnLink on; }; ' >> /etc/radvd.conf\n echo ' }; ' >> /etc/radvd.conf\n\n # enable ipv6 forwarding\n sysctl -w net.ipv6.conf.all.forwarding=1\n service radvd restart\n\nelse\n ip link set test1p master testbr\n ip link set test2p master testbr\n # Run joint DHCP4/DHCP6 server with RA enabled in veth namespace\n if systemctl is-active firewalld; then\n for service in dhcp dhcpv6 dhcpv6-client; do\n if ! 
firewall-cmd --query-service=\"$service\"; then\n firewall-cmd --add-service \"$service\"\n fi\n done\n fi\n dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --dhcp-range=2001:DB8::10,2001:DB8::1FF,slaac,64,240 --enable-ra --interface=testbr --bind-interfaces\nfi\n", "start": "2024-09-20 13:27:22.824031", "end": "2024-09-20 13:27:24.044136", "delta": "0:00:01.220105", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euxo pipefail\nexec 1>&2\nip link add test1 type veth peer name test1p\nip link add test2 type veth peer name test2p\nif [ -n \"$(pgrep NetworkManager)\" ];then\n nmcli d set test1 managed true\n nmcli d set test2 managed true\n # NetworkManager should not manage DHCP server ports\n nmcli d set test1p managed false\n nmcli d set test2p managed false\nfi\nip link set test1p up\nip link set test2p up\n\n# Create the 'testbr' - providing both 10.x ipv4 and 2620:52:0 ipv6 dhcp\nip link add name testbr type bridge forward_delay 0\nif [ -n \"$(pgrep NetworkManager)\" ];then\n # NetworkManager should not manage DHCP server ports\n nmcli d set testbr managed false\nfi\nip link set testbr up\ntimer=0\n# The while loop following is a workaround for the NM bug, which can be\n# tracked in https://bugzilla.redhat.com/show_bug.cgi?id=2079642\nwhile ! ip addr show testbr | grep -q 'inet [1-9]'\ndo\n let \"timer+=1\"\n if [ $timer -eq 30 ]; then\n echo ERROR - could not add testbr\n ip addr\n exit 1\n fi\n sleep 1\n rc=0\n ip addr add 192.0.2.1/24 dev testbr || rc=\"$?\"\n if [ \"$rc\" != 0 ]; then\n echo NOTICE - could not add testbr - error code \"$rc\"\n continue\n fi\n ip -6 addr add 2001:DB8::1/32 dev testbr || rc=\"$?\"\n if [ \"$rc\" != 0 ]; then\n echo NOTICE - could not add testbr - error code \"$rc\"\n continue\n fi\ndone\n\nif grep 'release 6' /etc/redhat-release; then\n # We need bridge-utils and radvd only in rhel6\n if ! 
rpm -q --quiet radvd; then yum -y install radvd; fi\n if ! rpm -q --quiet bridge-utils; then yum -y install bridge-utils; fi\n\n # We need to add iptables rule to allow dhcp request\n iptables -I INPUT -i testbr -p udp --dport 67:68 --sport 67:68 -j ACCEPT\n\n # Add test1, test2 peers into the testbr\n brctl addif testbr test1p\n brctl addif testbr test2p\n\n # in RHEL6 /run is not present\n mkdir -p /run\n\n # and dnsmasq does not support ipv6\n dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --interface=testbr --bind-interfaces\n\n # start radvd for ipv6\n echo 'interface testbr {' > /etc/radvd.conf\n echo ' AdvSendAdvert on;' >> /etc/radvd.conf\n echo ' prefix 2001:DB8::/64 { ' >> /etc/radvd.conf\n echo ' AdvOnLink on; }; ' >> /etc/radvd.conf\n echo ' }; ' >> /etc/radvd.conf\n\n # enable ipv6 forwarding\n sysctl -w net.ipv6.conf.all.forwarding=1\n service radvd restart\n\nelse\n ip link set test1p master testbr\n ip link set test2p master testbr\n # Run joint DHCP4/DHCP6 server with RA enabled in veth namespace\n if systemctl is-active firewalld; then\n for service in dhcp dhcpv6 dhcpv6-client; do\n if ! firewall-cmd --query-service=\"$service\"; then\n firewall-cmd --add-service \"$service\"\n fi\n done\n fi\n dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --dhcp-range=2001:DB8::10,2001:DB8::1FF,slaac,64,240 --enable-ra --interface=testbr --bind-interfaces\nfi\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 11044 1726853244.06204: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.45.153 closed. 
<<< 11044 1726853244.06208: stdout chunk (state=3): >>><<< 11044 1726853244.06211: stderr chunk (state=3): >>><<< 11044 1726853244.06378: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "+ exec\n+ ip link add test1 type veth peer name test1p\n+ ip link add test2 type veth peer name test2p\n++ pgrep NetworkManager\n+ '[' -n 702 ']'\n+ nmcli d set test1 managed true\n+ nmcli d set test2 managed true\n+ nmcli d set test1p managed false\n+ nmcli d set test2p managed false\n+ ip link set test1p up\n+ ip link set test2p up\n+ ip link add name testbr type bridge forward_delay 0\n++ pgrep NetworkManager\n+ '[' -n 702 ']'\n+ nmcli d set testbr managed false\n+ ip link set testbr up\n+ timer=0\n+ ip addr show testbr\n+ grep -q 'inet [1-9]'\n+ let timer+=1\n+ '[' 1 -eq 30 ']'\n+ sleep 1\n+ rc=0\n+ ip addr add 192.0.2.1/24 dev testbr\n+ '[' 0 '!=' 0 ']'\n+ ip -6 addr add 2001:DB8::1/32 dev testbr\n+ '[' 0 '!=' 0 ']'\n+ ip addr show testbr\n+ grep -q 'inet [1-9]'\n+ grep 'release 6' /etc/redhat-release\n+ ip link set test1p master testbr\n+ ip link set test2p master testbr\n+ systemctl is-active firewalld\ninactive\n+ dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --dhcp-range=2001:DB8::10,2001:DB8::1FF,slaac,64,240 --enable-ra --interface=testbr --bind-interfaces", "rc": 0, "cmd": "set -euxo pipefail\nexec 1>&2\nip link add test1 type veth peer name test1p\nip link add test2 type veth peer name test2p\nif [ -n \"$(pgrep NetworkManager)\" ];then\n nmcli d set test1 managed true\n nmcli d set test2 managed true\n # NetworkManager should not manage DHCP server ports\n nmcli d set test1p managed false\n nmcli d set test2p managed false\nfi\nip link set test1p up\nip link set test2p up\n\n# Create the 'testbr' - providing both 10.x ipv4 and 2620:52:0 ipv6 dhcp\nip link add name testbr type bridge forward_delay 0\nif [ -n \"$(pgrep NetworkManager)\" ];then\n # 
NetworkManager should not manage DHCP server ports\n nmcli d set testbr managed false\nfi\nip link set testbr up\ntimer=0\n# The while loop following is a workaround for the NM bug, which can be\n# tracked in https://bugzilla.redhat.com/show_bug.cgi?id=2079642\nwhile ! ip addr show testbr | grep -q 'inet [1-9]'\ndo\n let \"timer+=1\"\n if [ $timer -eq 30 ]; then\n echo ERROR - could not add testbr\n ip addr\n exit 1\n fi\n sleep 1\n rc=0\n ip addr add 192.0.2.1/24 dev testbr || rc=\"$?\"\n if [ \"$rc\" != 0 ]; then\n echo NOTICE - could not add testbr - error code \"$rc\"\n continue\n fi\n ip -6 addr add 2001:DB8::1/32 dev testbr || rc=\"$?\"\n if [ \"$rc\" != 0 ]; then\n echo NOTICE - could not add testbr - error code \"$rc\"\n continue\n fi\ndone\n\nif grep 'release 6' /etc/redhat-release; then\n # We need bridge-utils and radvd only in rhel6\n if ! rpm -q --quiet radvd; then yum -y install radvd; fi\n if ! rpm -q --quiet bridge-utils; then yum -y install bridge-utils; fi\n\n # We need to add iptables rule to allow dhcp request\n iptables -I INPUT -i testbr -p udp --dport 67:68 --sport 67:68 -j ACCEPT\n\n # Add test1, test2 peers into the testbr\n brctl addif testbr test1p\n brctl addif testbr test2p\n\n # in RHEL6 /run is not present\n mkdir -p /run\n\n # and dnsmasq does not support ipv6\n dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --interface=testbr --bind-interfaces\n\n # start radvd for ipv6\n echo 'interface testbr {' > /etc/radvd.conf\n echo ' AdvSendAdvert on;' >> /etc/radvd.conf\n echo ' prefix 2001:DB8::/64 { ' >> /etc/radvd.conf\n echo ' AdvOnLink on; }; ' >> /etc/radvd.conf\n echo ' }; ' >> /etc/radvd.conf\n\n # enable ipv6 forwarding\n sysctl -w net.ipv6.conf.all.forwarding=1\n service radvd restart\n\nelse\n ip link set test1p master testbr\n ip link set test2p master testbr\n # Run joint DHCP4/DHCP6 server with RA enabled in veth namespace\n if systemctl is-active 
firewalld; then\n for service in dhcp dhcpv6 dhcpv6-client; do\n if ! firewall-cmd --query-service=\"$service\"; then\n firewall-cmd --add-service \"$service\"\n fi\n done\n fi\n dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --dhcp-range=2001:DB8::10,2001:DB8::1FF,slaac,64,240 --enable-ra --interface=testbr --bind-interfaces\nfi\n", "start": "2024-09-20 13:27:22.824031", "end": "2024-09-20 13:27:24.044136", "delta": "0:00:01.220105", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euxo pipefail\nexec 1>&2\nip link add test1 type veth peer name test1p\nip link add test2 type veth peer name test2p\nif [ -n \"$(pgrep NetworkManager)\" ];then\n nmcli d set test1 managed true\n nmcli d set test2 managed true\n # NetworkManager should not manage DHCP server ports\n nmcli d set test1p managed false\n nmcli d set test2p managed false\nfi\nip link set test1p up\nip link set test2p up\n\n# Create the 'testbr' - providing both 10.x ipv4 and 2620:52:0 ipv6 dhcp\nip link add name testbr type bridge forward_delay 0\nif [ -n \"$(pgrep NetworkManager)\" ];then\n # NetworkManager should not manage DHCP server ports\n nmcli d set testbr managed false\nfi\nip link set testbr up\ntimer=0\n# The while loop following is a workaround for the NM bug, which can be\n# tracked in https://bugzilla.redhat.com/show_bug.cgi?id=2079642\nwhile ! 
ip addr show testbr | grep -q 'inet [1-9]'\ndo\n let \"timer+=1\"\n if [ $timer -eq 30 ]; then\n echo ERROR - could not add testbr\n ip addr\n exit 1\n fi\n sleep 1\n rc=0\n ip addr add 192.0.2.1/24 dev testbr || rc=\"$?\"\n if [ \"$rc\" != 0 ]; then\n echo NOTICE - could not add testbr - error code \"$rc\"\n continue\n fi\n ip -6 addr add 2001:DB8::1/32 dev testbr || rc=\"$?\"\n if [ \"$rc\" != 0 ]; then\n echo NOTICE - could not add testbr - error code \"$rc\"\n continue\n fi\ndone\n\nif grep 'release 6' /etc/redhat-release; then\n # We need bridge-utils and radvd only in rhel6\n if ! rpm -q --quiet radvd; then yum -y install radvd; fi\n if ! rpm -q --quiet bridge-utils; then yum -y install bridge-utils; fi\n\n # We need to add iptables rule to allow dhcp request\n iptables -I INPUT -i testbr -p udp --dport 67:68 --sport 67:68 -j ACCEPT\n\n # Add test1, test2 peers into the testbr\n brctl addif testbr test1p\n brctl addif testbr test2p\n\n # in RHEL6 /run is not present\n mkdir -p /run\n\n # and dnsmasq does not support ipv6\n dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --interface=testbr --bind-interfaces\n\n # start radvd for ipv6\n echo 'interface testbr {' > /etc/radvd.conf\n echo ' AdvSendAdvert on;' >> /etc/radvd.conf\n echo ' prefix 2001:DB8::/64 { ' >> /etc/radvd.conf\n echo ' AdvOnLink on; }; ' >> /etc/radvd.conf\n echo ' }; ' >> /etc/radvd.conf\n\n # enable ipv6 forwarding\n sysctl -w net.ipv6.conf.all.forwarding=1\n service radvd restart\n\nelse\n ip link set test1p master testbr\n ip link set test2p master testbr\n # Run joint DHCP4/DHCP6 server with RA enabled in veth namespace\n if systemctl is-active firewalld; then\n for service in dhcp dhcpv6 dhcpv6-client; do\n if ! 
firewall-cmd --query-service=\"$service\"; then\n firewall-cmd --add-service \"$service\"\n fi\n done\n fi\n dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --dhcp-range=2001:DB8::10,2001:DB8::1FF,slaac,64,240 --enable-ra --interface=testbr --bind-interfaces\nfi\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.45.153 closed. 
11044 1726853244.06388: done with _execute_module (ansible.legacy.command, {'_raw_params': 'set -euxo pipefail\nexec 1>&2\nip link add test1 type veth peer name test1p\nip link add test2 type veth peer name test2p\nif [ -n "$(pgrep NetworkManager)" ];then\n nmcli d set test1 managed true\n nmcli d set test2 managed true\n # NetworkManager should not manage DHCP server ports\n nmcli d set test1p managed false\n nmcli d set test2p managed false\nfi\nip link set test1p up\nip link set test2p up\n\n# Create the \'testbr\' - providing both 10.x ipv4 and 2620:52:0 ipv6 dhcp\nip link add name testbr type bridge forward_delay 0\nif [ -n "$(pgrep NetworkManager)" ];then\n # NetworkManager should not manage DHCP server ports\n nmcli d set testbr managed false\nfi\nip link set testbr up\ntimer=0\n# The while loop following is a workaround for the NM bug, which can be\n# tracked in https://bugzilla.redhat.com/show_bug.cgi?id=2079642\nwhile ! ip addr show testbr | grep -q \'inet [1-9]\'\ndo\n let "timer+=1"\n if [ $timer -eq 30 ]; then\n echo ERROR - could not add testbr\n ip addr\n exit 1\n fi\n sleep 1\n rc=0\n ip addr add 192.0.2.1/24 dev testbr || rc="$?"\n if [ "$rc" != 0 ]; then\n echo NOTICE - could not add testbr - error code "$rc"\n continue\n fi\n ip -6 addr add 2001:DB8::1/32 dev testbr || rc="$?"\n if [ "$rc" != 0 ]; then\n echo NOTICE - could not add testbr - error code "$rc"\n continue\n fi\ndone\n\nif grep \'release 6\' /etc/redhat-release; then\n # We need bridge-utils and radvd only in rhel6\n if ! rpm -q --quiet radvd; then yum -y install radvd; fi\n if ! 
rpm -q --quiet bridge-utils; then yum -y install bridge-utils; fi\n\n # We need to add iptables rule to allow dhcp request\n iptables -I INPUT -i testbr -p udp --dport 67:68 --sport 67:68 -j ACCEPT\n\n # Add test1, test2 peers into the testbr\n brctl addif testbr test1p\n brctl addif testbr test2p\n\n # in RHEL6 /run is not present\n mkdir -p /run\n\n # and dnsmasq does not support ipv6\n dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --interface=testbr --bind-interfaces\n\n # start radvd for ipv6\n echo \'interface testbr {\' > /etc/radvd.conf\n echo \' AdvSendAdvert on;\' >> /etc/radvd.conf\n echo \' prefix 2001:DB8::/64 { \' >> /etc/radvd.conf\n echo \' AdvOnLink on; }; \' >> /etc/radvd.conf\n echo \' }; \' >> /etc/radvd.conf\n\n # enable ipv6 forwarding\n sysctl -w net.ipv6.conf.all.forwarding=1\n service radvd restart\n\nelse\n ip link set test1p master testbr\n ip link set test2p master testbr\n # Run joint DHCP4/DHCP6 server with RA enabled in veth namespace\n if systemctl is-active firewalld; then\n for service in dhcp dhcpv6 dhcpv6-client; do\n if ! 
firewall-cmd --query-service="$service"; then\n firewall-cmd --add-service "$service"\n fi\n done\n fi\n dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --dhcp-range=2001:DB8::10,2001:DB8::1FF,slaac,64,240 --enable-ra --interface=testbr --bind-interfaces\nfi\n', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853242.437902-11467-199656231104479/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11044 1726853244.06391: _low_level_execute_command(): starting 11044 1726853244.06393: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853242.437902-11467-199656231104479/ > /dev/null 2>&1 && sleep 0' 11044 1726853244.06991: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11044 1726853244.07004: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11044 1726853244.07017: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11044 1726853244.07066: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 
10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11044 1726853244.07129: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 11044 1726853244.07153: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11044 1726853244.07196: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11044 1726853244.07304: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11044 1726853244.09229: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11044 1726853244.09253: stdout chunk (state=3): >>><<< 11044 1726853244.09256: stderr chunk (state=3): >>><<< 11044 1726853244.09275: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11044 1726853244.09481: handler run complete 11044 1726853244.09485: Evaluated conditional (False): False 11044 1726853244.09487: attempt loop complete, returning result 11044 1726853244.09490: _execute() done 11044 1726853244.09492: dumping result to json 11044 1726853244.09494: done dumping result, returning 11044 1726853244.09496: done running TaskExecutor() for managed_node1/TASK: Create test interfaces [02083763-bbaf-c5a6-f857-000000000012] 11044 1726853244.09498: sending task result for task 02083763-bbaf-c5a6-f857-000000000012 11044 1726853244.09567: done sending task result for task 02083763-bbaf-c5a6-f857-000000000012 11044 1726853244.09570: WORKER PROCESS EXITING ok: [managed_node1] => { "changed": false, "cmd": "set -euxo pipefail\nexec 1>&2\nip link add test1 type veth peer name test1p\nip link add test2 type veth peer name test2p\nif [ -n \"$(pgrep NetworkManager)\" ];then\n nmcli d set test1 managed true\n nmcli d set test2 managed true\n # NetworkManager should not manage DHCP server ports\n nmcli d set test1p managed false\n nmcli d set test2p managed false\nfi\nip link set test1p up\nip link set test2p up\n\n# Create the 'testbr' - providing both 10.x ipv4 and 2620:52:0 ipv6 dhcp\nip link add name testbr type bridge forward_delay 0\nif [ -n \"$(pgrep NetworkManager)\" ];then\n # NetworkManager should not manage DHCP server ports\n nmcli d set testbr managed false\nfi\nip link set testbr up\ntimer=0\n# The while loop following is a workaround for the NM bug, which can be\n# tracked in https://bugzilla.redhat.com/show_bug.cgi?id=2079642\nwhile ! 
ip addr show testbr | grep -q 'inet [1-9]'\ndo\n let \"timer+=1\"\n if [ $timer -eq 30 ]; then\n echo ERROR - could not add testbr\n ip addr\n exit 1\n fi\n sleep 1\n rc=0\n ip addr add 192.0.2.1/24 dev testbr || rc=\"$?\"\n if [ \"$rc\" != 0 ]; then\n echo NOTICE - could not add testbr - error code \"$rc\"\n continue\n fi\n ip -6 addr add 2001:DB8::1/32 dev testbr || rc=\"$?\"\n if [ \"$rc\" != 0 ]; then\n echo NOTICE - could not add testbr - error code \"$rc\"\n continue\n fi\ndone\n\nif grep 'release 6' /etc/redhat-release; then\n # We need bridge-utils and radvd only in rhel6\n if ! rpm -q --quiet radvd; then yum -y install radvd; fi\n if ! rpm -q --quiet bridge-utils; then yum -y install bridge-utils; fi\n\n # We need to add iptables rule to allow dhcp request\n iptables -I INPUT -i testbr -p udp --dport 67:68 --sport 67:68 -j ACCEPT\n\n # Add test1, test2 peers into the testbr\n brctl addif testbr test1p\n brctl addif testbr test2p\n\n # in RHEL6 /run is not present\n mkdir -p /run\n\n # and dnsmasq does not support ipv6\n dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --interface=testbr --bind-interfaces\n\n # start radvd for ipv6\n echo 'interface testbr {' > /etc/radvd.conf\n echo ' AdvSendAdvert on;' >> /etc/radvd.conf\n echo ' prefix 2001:DB8::/64 { ' >> /etc/radvd.conf\n echo ' AdvOnLink on; }; ' >> /etc/radvd.conf\n echo ' }; ' >> /etc/radvd.conf\n\n # enable ipv6 forwarding\n sysctl -w net.ipv6.conf.all.forwarding=1\n service radvd restart\n\nelse\n ip link set test1p master testbr\n ip link set test2p master testbr\n # Run joint DHCP4/DHCP6 server with RA enabled in veth namespace\n if systemctl is-active firewalld; then\n for service in dhcp dhcpv6 dhcpv6-client; do\n if ! 
firewall-cmd --query-service=\"$service\"; then\n firewall-cmd --add-service \"$service\"\n fi\n done\n fi\n dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --dhcp-range=2001:DB8::10,2001:DB8::1FF,slaac,64,240 --enable-ra --interface=testbr --bind-interfaces\nfi\n", "delta": "0:00:01.220105", "end": "2024-09-20 13:27:24.044136", "rc": 0, "start": "2024-09-20 13:27:22.824031" } STDERR: + exec + ip link add test1 type veth peer name test1p + ip link add test2 type veth peer name test2p ++ pgrep NetworkManager + '[' -n 702 ']' + nmcli d set test1 managed true + nmcli d set test2 managed true + nmcli d set test1p managed false + nmcli d set test2p managed false + ip link set test1p up + ip link set test2p up + ip link add name testbr type bridge forward_delay 0 ++ pgrep NetworkManager + '[' -n 702 ']' + nmcli d set testbr managed false + ip link set testbr up + timer=0 + ip addr show testbr + grep -q 'inet [1-9]' + let timer+=1 + '[' 1 -eq 30 ']' + sleep 1 + rc=0 + ip addr add 192.0.2.1/24 dev testbr + '[' 0 '!=' 0 ']' + ip -6 addr add 2001:DB8::1/32 dev testbr + '[' 0 '!=' 0 ']' + ip addr show testbr + grep -q 'inet [1-9]' + grep 'release 6' /etc/redhat-release + ip link set test1p master testbr + ip link set test2p master testbr + systemctl is-active firewalld inactive + dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --dhcp-range=2001:DB8::10,2001:DB8::1FF,slaac,64,240 --enable-ra --interface=testbr --bind-interfaces 11044 1726853244.09650: no more pending results, returning what we have 11044 1726853244.09653: results queue empty 11044 1726853244.09654: checking for any_errors_fatal 11044 1726853244.09660: done checking for any_errors_fatal 11044 1726853244.09661: checking for max_fail_percentage 11044 1726853244.09663: done checking for max_fail_percentage 11044 1726853244.09663: checking to see if all hosts have failed and 
the running result is not ok 11044 1726853244.09664: done checking to see if all hosts have failed 11044 1726853244.09665: getting the remaining hosts for this loop 11044 1726853244.09666: done getting the remaining hosts for this loop 11044 1726853244.09669: getting the next task for host managed_node1 11044 1726853244.09880: done getting next task for host managed_node1 11044 1726853244.09884: ^ task is: TASK: Include the task 'get_interface_stat.yml' 11044 1726853244.09887: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11044 1726853244.09890: getting variables 11044 1726853244.09892: in VariableManager get_vars() 11044 1726853244.09924: Calling all_inventory to load vars for managed_node1 11044 1726853244.09927: Calling groups_inventory to load vars for managed_node1 11044 1726853244.09929: Calling all_plugins_inventory to load vars for managed_node1 11044 1726853244.09938: Calling all_plugins_play to load vars for managed_node1 11044 1726853244.09940: Calling groups_plugins_inventory to load vars for managed_node1 11044 1726853244.09943: Calling groups_plugins_play to load vars for managed_node1 11044 1726853244.10217: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11044 1726853244.10409: done with get_vars() 11044 1726853244.10418: done getting variables TASK [Include the task 'get_interface_stat.yml'] ******************************* task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml:3 Friday 20 September 2024 13:27:24 -0400 (0:00:01.724) 0:00:08.480 ****** 11044 1726853244.10511: entering _queue_task() for managed_node1/include_tasks 11044 1726853244.10768: worker is 1 (out of 1 available) 11044 1726853244.10783: exiting _queue_task() for managed_node1/include_tasks 11044 1726853244.10796: done queuing things up, now waiting for results queue to drain 11044 1726853244.10797: waiting for pending results... 
11044 1726853244.11044: running TaskExecutor() for managed_node1/TASK: Include the task 'get_interface_stat.yml' 11044 1726853244.11152: in run() - task 02083763-bbaf-c5a6-f857-000000000016 11044 1726853244.11180: variable 'ansible_search_path' from source: unknown 11044 1726853244.11188: variable 'ansible_search_path' from source: unknown 11044 1726853244.11227: calling self._execute() 11044 1726853244.11316: variable 'ansible_host' from source: host vars for 'managed_node1' 11044 1726853244.11326: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11044 1726853244.11336: variable 'omit' from source: magic vars 11044 1726853244.11706: variable 'ansible_distribution_major_version' from source: facts 11044 1726853244.11721: Evaluated conditional (ansible_distribution_major_version != '6'): True 11044 1726853244.11775: _execute() done 11044 1726853244.11778: dumping result to json 11044 1726853244.11781: done dumping result, returning 11044 1726853244.11784: done running TaskExecutor() for managed_node1/TASK: Include the task 'get_interface_stat.yml' [02083763-bbaf-c5a6-f857-000000000016] 11044 1726853244.11786: sending task result for task 02083763-bbaf-c5a6-f857-000000000016 11044 1726853244.11979: done sending task result for task 02083763-bbaf-c5a6-f857-000000000016 11044 1726853244.11982: WORKER PROCESS EXITING 11044 1726853244.12006: no more pending results, returning what we have 11044 1726853244.12010: in VariableManager get_vars() 11044 1726853244.12054: Calling all_inventory to load vars for managed_node1 11044 1726853244.12057: Calling groups_inventory to load vars for managed_node1 11044 1726853244.12059: Calling all_plugins_inventory to load vars for managed_node1 11044 1726853244.12070: Calling all_plugins_play to load vars for managed_node1 11044 1726853244.12075: Calling groups_plugins_inventory to load vars for managed_node1 11044 1726853244.12077: Calling groups_plugins_play to load vars for managed_node1 11044 
1726853244.12333: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11044 1726853244.12512: done with get_vars() 11044 1726853244.12519: variable 'ansible_search_path' from source: unknown 11044 1726853244.12520: variable 'ansible_search_path' from source: unknown 11044 1726853244.12553: we have included files to process 11044 1726853244.12554: generating all_blocks data 11044 1726853244.12555: done generating all_blocks data 11044 1726853244.12556: processing included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 11044 1726853244.12557: loading included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 11044 1726853244.12559: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 11044 1726853244.12792: done processing included file 11044 1726853244.12793: iterating over new_blocks loaded from include file 11044 1726853244.12795: in VariableManager get_vars() 11044 1726853244.12815: done with get_vars() 11044 1726853244.12816: filtering new block on tags 11044 1726853244.12833: done filtering new block on tags 11044 1726853244.12836: done iterating over new_blocks loaded from include file included: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml for managed_node1 11044 1726853244.12875: extending task lists for all hosts with included blocks 11044 1726853244.12976: done extending task lists 11044 1726853244.12977: done processing included files 11044 1726853244.12978: results queue empty 11044 1726853244.12979: checking for any_errors_fatal 11044 1726853244.12984: done checking for any_errors_fatal 11044 1726853244.12985: checking for max_fail_percentage 11044 1726853244.12986: done checking for 
max_fail_percentage 11044 1726853244.12986: checking to see if all hosts have failed and the running result is not ok 11044 1726853244.12987: done checking to see if all hosts have failed 11044 1726853244.12988: getting the remaining hosts for this loop 11044 1726853244.12989: done getting the remaining hosts for this loop 11044 1726853244.12991: getting the next task for host managed_node1 11044 1726853244.12996: done getting next task for host managed_node1 11044 1726853244.12998: ^ task is: TASK: Get stat for interface {{ interface }} 11044 1726853244.13001: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11044 1726853244.13004: getting variables 11044 1726853244.13005: in VariableManager get_vars() 11044 1726853244.13017: Calling all_inventory to load vars for managed_node1 11044 1726853244.13020: Calling groups_inventory to load vars for managed_node1 11044 1726853244.13022: Calling all_plugins_inventory to load vars for managed_node1 11044 1726853244.13027: Calling all_plugins_play to load vars for managed_node1 11044 1726853244.13030: Calling groups_plugins_inventory to load vars for managed_node1 11044 1726853244.13033: Calling groups_plugins_play to load vars for managed_node1 11044 1726853244.13162: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11044 1726853244.13342: done with get_vars() 11044 1726853244.13351: done getting variables 11044 1726853244.13513: variable 'interface' from source: task vars 11044 1726853244.13518: variable 'dhcp_interface1' from source: play vars 11044 1726853244.13584: variable 'dhcp_interface1' from source: play vars TASK [Get stat for interface test1] ******************************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml:3 Friday 20 September 2024 13:27:24 -0400 (0:00:00.031) 0:00:08.511 ****** 11044 1726853244.13624: entering _queue_task() for managed_node1/stat 11044 1726853244.14093: worker is 1 (out of 1 available) 11044 1726853244.14104: exiting _queue_task() for managed_node1/stat 11044 1726853244.14114: done queuing things up, now waiting for results queue to drain 11044 1726853244.14115: waiting for pending results... 
11044 1726853244.14191: running TaskExecutor() for managed_node1/TASK: Get stat for interface test1 11044 1726853244.14318: in run() - task 02083763-bbaf-c5a6-f857-000000000153 11044 1726853244.14342: variable 'ansible_search_path' from source: unknown 11044 1726853244.14350: variable 'ansible_search_path' from source: unknown 11044 1726853244.14394: calling self._execute() 11044 1726853244.14480: variable 'ansible_host' from source: host vars for 'managed_node1' 11044 1726853244.14494: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11044 1726853244.14558: variable 'omit' from source: magic vars 11044 1726853244.14859: variable 'ansible_distribution_major_version' from source: facts 11044 1726853244.14882: Evaluated conditional (ansible_distribution_major_version != '6'): True 11044 1726853244.14896: variable 'omit' from source: magic vars 11044 1726853244.14947: variable 'omit' from source: magic vars 11044 1726853244.15050: variable 'interface' from source: task vars 11044 1726853244.15061: variable 'dhcp_interface1' from source: play vars 11044 1726853244.15129: variable 'dhcp_interface1' from source: play vars 11044 1726853244.15150: variable 'omit' from source: magic vars 11044 1726853244.15215: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11044 1726853244.15229: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11044 1726853244.15251: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11044 1726853244.15273: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11044 1726853244.15289: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11044 1726853244.15375: variable 'inventory_hostname' from source: host vars for 
'managed_node1' 11044 1726853244.15379: variable 'ansible_host' from source: host vars for 'managed_node1' 11044 1726853244.15381: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11044 1726853244.15433: Set connection var ansible_timeout to 10 11044 1726853244.15446: Set connection var ansible_shell_executable to /bin/sh 11044 1726853244.15452: Set connection var ansible_shell_type to sh 11044 1726853244.15459: Set connection var ansible_module_compression to ZIP_DEFLATED 11044 1726853244.15467: Set connection var ansible_connection to ssh 11044 1726853244.15477: Set connection var ansible_pipelining to False 11044 1726853244.15501: variable 'ansible_shell_executable' from source: unknown 11044 1726853244.15507: variable 'ansible_connection' from source: unknown 11044 1726853244.15512: variable 'ansible_module_compression' from source: unknown 11044 1726853244.15518: variable 'ansible_shell_type' from source: unknown 11044 1726853244.15523: variable 'ansible_shell_executable' from source: unknown 11044 1726853244.15528: variable 'ansible_host' from source: host vars for 'managed_node1' 11044 1726853244.15535: variable 'ansible_pipelining' from source: unknown 11044 1726853244.15576: variable 'ansible_timeout' from source: unknown 11044 1726853244.15579: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11044 1726853244.15733: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 11044 1726853244.15746: variable 'omit' from source: magic vars 11044 1726853244.15760: starting attempt loop 11044 1726853244.15766: running the handler 11044 1726853244.15785: _low_level_execute_command(): starting 11044 1726853244.15799: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11044 
1726853244.16535: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 11044 1726853244.16577: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.45.153 is address <<< 11044 1726853244.16591: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11044 1726853244.16664: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 11044 1726853244.16680: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11044 1726853244.16765: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11044 1726853244.18461: stdout chunk (state=3): >>>/root <<< 11044 1726853244.18678: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11044 1726853244.18681: stdout chunk (state=3): >>><<< 11044 1726853244.18684: stderr chunk (state=3): >>><<< 11044 1726853244.18686: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11044 1726853244.18689: _low_level_execute_command(): starting 11044 1726853244.18692: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853244.1861928-11539-256841411941778 `" && echo ansible-tmp-1726853244.1861928-11539-256841411941778="` echo /root/.ansible/tmp/ansible-tmp-1726853244.1861928-11539-256841411941778 `" ) && sleep 0' 11044 1726853244.19321: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11044 1726853244.19367: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11044 1726853244.19475: stderr chunk (state=3): 
>>>debug1: configuration requests final Match pass <<< 11044 1726853244.19479: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 11044 1726853244.19495: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11044 1726853244.19516: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11044 1726853244.19580: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11044 1726853244.21535: stdout chunk (state=3): >>>ansible-tmp-1726853244.1861928-11539-256841411941778=/root/.ansible/tmp/ansible-tmp-1726853244.1861928-11539-256841411941778 <<< 11044 1726853244.21803: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11044 1726853244.21807: stdout chunk (state=3): >>><<< 11044 1726853244.21809: stderr chunk (state=3): >>><<< 11044 1726853244.21811: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853244.1861928-11539-256841411941778=/root/.ansible/tmp/ansible-tmp-1726853244.1861928-11539-256841411941778 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass 
debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11044 1726853244.21814: variable 'ansible_module_compression' from source: unknown 11044 1726853244.21822: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-1104467doc9gy/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 11044 1726853244.21860: variable 'ansible_facts' from source: unknown 11044 1726853244.21953: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853244.1861928-11539-256841411941778/AnsiballZ_stat.py 11044 1726853244.22187: Sending initial data 11044 1726853244.22191: Sent initial data (153 bytes) 11044 1726853244.23494: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11044 1726853244.23509: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 11044 1726853244.23521: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11044 1726853244.23541: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11044 1726853244.23847: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11044 1726853244.25452: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 11044 1726853244.25491: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 11044 1726853244.25547: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-1104467doc9gy/tmpt35aysj_ /root/.ansible/tmp/ansible-tmp-1726853244.1861928-11539-256841411941778/AnsiballZ_stat.py <<< 11044 1726853244.25551: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853244.1861928-11539-256841411941778/AnsiballZ_stat.py" <<< 11044 1726853244.25582: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-1104467doc9gy/tmpt35aysj_" to remote "/root/.ansible/tmp/ansible-tmp-1726853244.1861928-11539-256841411941778/AnsiballZ_stat.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853244.1861928-11539-256841411941778/AnsiballZ_stat.py" <<< 11044 1726853244.26947: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11044 1726853244.27116: stderr chunk (state=3): >>><<< 11044 1726853244.27120: stdout chunk (state=3): >>><<< 11044 1726853244.27122: done transferring module to remote 11044 1726853244.27125: _low_level_execute_command(): starting 11044 1726853244.27127: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853244.1861928-11539-256841411941778/ /root/.ansible/tmp/ansible-tmp-1726853244.1861928-11539-256841411941778/AnsiballZ_stat.py && sleep 0' 11044 1726853244.27947: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11044 1726853244.27987: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11044 1726853244.28004: stderr chunk (state=3): >>>debug1: configuration 
requests final Match pass <<< 11044 1726853244.28036: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11044 1726853244.28090: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11044 1726853244.28155: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 11044 1726853244.28191: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11044 1726853244.28377: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11044 1726853244.30247: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11044 1726853244.30251: stdout chunk (state=3): >>><<< 11044 1726853244.30255: stderr chunk (state=3): >>><<< 11044 1726853244.30258: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf 
debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11044 1726853244.30260: _low_level_execute_command(): starting 11044 1726853244.30262: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853244.1861928-11539-256841411941778/AnsiballZ_stat.py && sleep 0' 11044 1726853244.30836: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11044 1726853244.30855: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11044 1726853244.30870: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11044 1726853244.30925: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11044 1726853244.30991: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 11044 1726853244.31010: stderr 
chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11044 1726853244.31059: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11044 1726853244.31097: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11044 1726853244.46513: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": true, "path": "/sys/class/net/test1", "mode": "0777", "isdir": false, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": true, "issock": false, "uid": 0, "gid": 0, "size": 0, "inode": 26344, "dev": 23, "nlink": 1, "atime": 1726853242.8306735, "mtime": 1726853242.8306735, "ctime": 1726853242.8306735, "wusr": true, "rusr": true, "xusr": true, "wgrp": true, "rgrp": true, "xgrp": true, "woth": true, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "lnk_source": "/sys/devices/virtual/net/test1", "lnk_target": "../../devices/virtual/net/test1", "pw_name": "root", "gr_name": "root"}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/test1", "follow": false, "checksum_algorithm": "sha1"}}} <<< 11044 1726853244.47961: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.45.153 closed. 
<<< 11044 1726853244.47965: stdout chunk (state=3): >>><<< 11044 1726853244.47968: stderr chunk (state=3): >>><<< 11044 1726853244.47992: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": true, "path": "/sys/class/net/test1", "mode": "0777", "isdir": false, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": true, "issock": false, "uid": 0, "gid": 0, "size": 0, "inode": 26344, "dev": 23, "nlink": 1, "atime": 1726853242.8306735, "mtime": 1726853242.8306735, "ctime": 1726853242.8306735, "wusr": true, "rusr": true, "xusr": true, "wgrp": true, "rgrp": true, "xgrp": true, "woth": true, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "lnk_source": "/sys/devices/virtual/net/test1", "lnk_target": "../../devices/virtual/net/test1", "pw_name": "root", "gr_name": "root"}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/test1", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration 
data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.45.153 closed. 11044 1726853244.48114: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/sys/class/net/test1', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853244.1861928-11539-256841411941778/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11044 1726853244.48128: _low_level_execute_command(): starting 11044 1726853244.48181: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853244.1861928-11539-256841411941778/ > /dev/null 2>&1 && sleep 0' 11044 1726853244.49003: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11044 1726853244.49063: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 11044 
1726853244.49067: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found <<< 11044 1726853244.49073: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11044 1726853244.49125: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11044 1726853244.49156: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11044 1726853244.51177: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11044 1726853244.51181: stdout chunk (state=3): >>><<< 11044 1726853244.51183: stderr chunk (state=3): >>><<< 11044 1726853244.51185: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: 
match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11044 1726853244.51188: handler run complete 11044 1726853244.51190: attempt loop complete, returning result 11044 1726853244.51192: _execute() done 11044 1726853244.51194: dumping result to json 11044 1726853244.51196: done dumping result, returning 11044 1726853244.51198: done running TaskExecutor() for managed_node1/TASK: Get stat for interface test1 [02083763-bbaf-c5a6-f857-000000000153] 11044 1726853244.51200: sending task result for task 02083763-bbaf-c5a6-f857-000000000153 11044 1726853244.51267: done sending task result for task 02083763-bbaf-c5a6-f857-000000000153 11044 1726853244.51270: WORKER PROCESS EXITING
ok: [managed_node1] => {
    "changed": false,
    "stat": {
        "atime": 1726853242.8306735,
        "block_size": 4096,
        "blocks": 0,
        "ctime": 1726853242.8306735,
        "dev": 23,
        "device_type": 0,
        "executable": true,
        "exists": true,
        "gid": 0,
        "gr_name": "root",
        "inode": 26344,
        "isblk": false,
        "ischr": false,
        "isdir": false,
        "isfifo": false,
        "isgid": false,
        "islnk": true,
        "isreg": false,
        "issock": false,
        "isuid": false,
        "lnk_source": "/sys/devices/virtual/net/test1",
        "lnk_target": "../../devices/virtual/net/test1",
        "mode": "0777",
        "mtime": 1726853242.8306735,
        "nlink": 1,
        "path": "/sys/class/net/test1",
        "pw_name": "root",
        "readable": true,
        "rgrp": true,
        "roth": true,
        "rusr": true,
        "size": 0,
        "uid": 0,
        "wgrp": true,
        "woth": true,
        "writeable": true,
        "wusr": true,
        "xgrp": true,
        "xoth": true,
        "xusr": true
    }
}
11044 1726853244.51360: no more pending results, returning what we have 11044 1726853244.51363: results queue empty 11044 1726853244.51364: checking for any_errors_fatal 11044 1726853244.51365: done checking for any_errors_fatal
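The `stat` result above records the interface check: the module `lstat`s `/sys/class/net/test1` with `follow: false`, so the symlink itself (not its target) is examined, and `islnk`/`lnk_target` identify it as a link into `/sys/devices/virtual/net`. As a minimal sketch of that check in plain Python (using a throwaway temp-directory symlink, since `/sys/class/net/test1` is specific to the test host; `stat_like_ansible` is an illustrative helper, not Ansible's actual code):

```python
import os
import stat
import tempfile

def stat_like_ansible(path):
    """Collect a small subset of the fields the Ansible stat module reports,
    using os.lstat() (equivalent to follow=False) so the symlink itself is
    examined rather than its target."""
    st = os.lstat(path)
    info = {
        "exists": True,
        "path": path,
        "islnk": stat.S_ISLNK(st.st_mode),
        "isreg": stat.S_ISREG(st.st_mode),
        "isdir": stat.S_ISDIR(st.st_mode),
        "mode": "%04o" % stat.S_IMODE(st.st_mode),
    }
    if info["islnk"]:
        info["lnk_target"] = os.readlink(path)
    return info

# Demonstrate with a throwaway symlink standing in for /sys/class/net/test1.
tmp = tempfile.mkdtemp()
target = os.path.join(tmp, "device")
os.mkdir(target)
link = os.path.join(tmp, "test1")
os.symlink(target, link)

result = stat_like_ansible(link)
print(result["islnk"], result["lnk_target"])
```

The subsequent assert task only needs `interface_stat.stat.exists`; the rest of the fields come along for free from the single `lstat` call.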
11044 1726853244.51366: checking for max_fail_percentage 11044 1726853244.51367: done checking for max_fail_percentage 11044 1726853244.51368: checking to see if all hosts have failed and the running result is not ok 11044 1726853244.51369: done checking to see if all hosts have failed 11044 1726853244.51370: getting the remaining hosts for this loop 11044 1726853244.51606: done getting the remaining hosts for this loop 11044 1726853244.51611: getting the next task for host managed_node1 11044 1726853244.51618: done getting next task for host managed_node1 11044 1726853244.51621: ^ task is: TASK: Assert that the interface is present - '{{ interface }}' 11044 1726853244.51624: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11044 1726853244.51628: getting variables 11044 1726853244.51629: in VariableManager get_vars() 11044 1726853244.51666: Calling all_inventory to load vars for managed_node1 11044 1726853244.51668: Calling groups_inventory to load vars for managed_node1 11044 1726853244.51677: Calling all_plugins_inventory to load vars for managed_node1 11044 1726853244.51688: Calling all_plugins_play to load vars for managed_node1 11044 1726853244.51691: Calling groups_plugins_inventory to load vars for managed_node1 11044 1726853244.51694: Calling groups_plugins_play to load vars for managed_node1 11044 1726853244.51935: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11044 1726853244.52143: done with get_vars() 11044 1726853244.52153: done getting variables 11044 1726853244.52258: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) 11044 1726853244.52383: variable 'interface' from source: task vars 11044 1726853244.52387: variable 'dhcp_interface1' from source: play vars 11044 1726853244.52451: variable 'dhcp_interface1' from source: play vars TASK [Assert that the interface is present - 'test1'] ************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml:5 Friday 20 September 2024 13:27:24 -0400 (0:00:00.388) 0:00:08.900 ****** 11044 1726853244.52679: entering _queue_task() for managed_node1/assert 11044 1726853244.52681: Creating lock for assert 11044 1726853244.52955: worker is 1 (out of 1 available) 11044 1726853244.52967: exiting _queue_task() for managed_node1/assert 11044 1726853244.53180: done queuing things up, now waiting for results queue to drain 11044 
1726853244.53182: waiting for pending results... 11044 1726853244.53228: running TaskExecutor() for managed_node1/TASK: Assert that the interface is present - 'test1' 11044 1726853244.53349: in run() - task 02083763-bbaf-c5a6-f857-000000000017 11044 1726853244.53373: variable 'ansible_search_path' from source: unknown 11044 1726853244.53406: variable 'ansible_search_path' from source: unknown 11044 1726853244.53419: calling self._execute() 11044 1726853244.53497: variable 'ansible_host' from source: host vars for 'managed_node1' 11044 1726853244.53517: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11044 1726853244.53576: variable 'omit' from source: magic vars 11044 1726853244.53916: variable 'ansible_distribution_major_version' from source: facts 11044 1726853244.53933: Evaluated conditional (ansible_distribution_major_version != '6'): True 11044 1726853244.53949: variable 'omit' from source: magic vars 11044 1726853244.53998: variable 'omit' from source: magic vars 11044 1726853244.54108: variable 'interface' from source: task vars 11044 1726853244.54118: variable 'dhcp_interface1' from source: play vars 11044 1726853244.54278: variable 'dhcp_interface1' from source: play vars 11044 1726853244.54282: variable 'omit' from source: magic vars 11044 1726853244.54285: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11044 1726853244.54302: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11044 1726853244.54329: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11044 1726853244.54355: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11044 1726853244.54374: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11044 1726853244.54412: 
variable 'inventory_hostname' from source: host vars for 'managed_node1' 11044 1726853244.54421: variable 'ansible_host' from source: host vars for 'managed_node1' 11044 1726853244.54429: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11044 1726853244.54539: Set connection var ansible_timeout to 10 11044 1726853244.54558: Set connection var ansible_shell_executable to /bin/sh 11044 1726853244.54566: Set connection var ansible_shell_type to sh 11044 1726853244.54578: Set connection var ansible_module_compression to ZIP_DEFLATED 11044 1726853244.54589: Set connection var ansible_connection to ssh 11044 1726853244.54603: Set connection var ansible_pipelining to False 11044 1726853244.54632: variable 'ansible_shell_executable' from source: unknown 11044 1726853244.54641: variable 'ansible_connection' from source: unknown 11044 1726853244.54652: variable 'ansible_module_compression' from source: unknown 11044 1726853244.54712: variable 'ansible_shell_type' from source: unknown 11044 1726853244.54715: variable 'ansible_shell_executable' from source: unknown 11044 1726853244.54717: variable 'ansible_host' from source: host vars for 'managed_node1' 11044 1726853244.54719: variable 'ansible_pipelining' from source: unknown 11044 1726853244.54721: variable 'ansible_timeout' from source: unknown 11044 1726853244.54723: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11044 1726853244.54839: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11044 1726853244.54860: variable 'omit' from source: magic vars 11044 1726853244.54870: starting attempt loop 11044 1726853244.54880: running the handler 11044 1726853244.55019: variable 'interface_stat' from source: set_fact 11044 
1726853244.55058: Evaluated conditional (interface_stat.stat.exists): True 11044 1726853244.55149: handler run complete 11044 1726853244.55152: attempt loop complete, returning result 11044 1726853244.55154: _execute() done 11044 1726853244.55156: dumping result to json 11044 1726853244.55158: done dumping result, returning 11044 1726853244.55160: done running TaskExecutor() for managed_node1/TASK: Assert that the interface is present - 'test1' [02083763-bbaf-c5a6-f857-000000000017] 11044 1726853244.55162: sending task result for task 02083763-bbaf-c5a6-f857-000000000017 11044 1726853244.55233: done sending task result for task 02083763-bbaf-c5a6-f857-000000000017 11044 1726853244.55236: WORKER PROCESS EXITING
ok: [managed_node1] => {
    "changed": false
}

MSG:

All assertions passed
11044 1726853244.55300: no more pending results, returning what we have 11044 1726853244.55304: results queue empty 11044 1726853244.55305: checking for any_errors_fatal 11044 1726853244.55315: done checking for any_errors_fatal 11044 1726853244.55316: checking for max_fail_percentage 11044 1726853244.55317: done checking for max_fail_percentage 11044 1726853244.55318: checking to see if all hosts have failed and the running result is not ok 11044 1726853244.55320: done checking to see if all hosts have failed 11044 1726853244.55321: getting the remaining hosts for this loop 11044 1726853244.55322: done getting the remaining hosts for this loop 11044 1726853244.55325: getting the next task for host managed_node1 11044 1726853244.55334: done getting next task for host managed_node1 11044 1726853244.55337: ^ task is: TASK: Include the task 'get_interface_stat.yml' 11044 1726853244.55340: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state?
(HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11044 1726853244.55343: getting variables 11044 1726853244.55348: in VariableManager get_vars() 11044 1726853244.55394: Calling all_inventory to load vars for managed_node1 11044 1726853244.55397: Calling groups_inventory to load vars for managed_node1 11044 1726853244.55400: Calling all_plugins_inventory to load vars for managed_node1 11044 1726853244.55412: Calling all_plugins_play to load vars for managed_node1 11044 1726853244.55415: Calling groups_plugins_inventory to load vars for managed_node1 11044 1726853244.55419: Calling groups_plugins_play to load vars for managed_node1 11044 1726853244.55797: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11044 1726853244.56041: done with get_vars() 11044 1726853244.56053: done getting variables TASK [Include the task 'get_interface_stat.yml'] ******************************* task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml:3 Friday 20 September 2024 13:27:24 -0400 (0:00:00.036) 0:00:08.937 ****** 11044 1726853244.56137: entering _queue_task() for managed_node1/include_tasks 11044 1726853244.56380: worker is 1 (out of 1 available) 11044 1726853244.56390: exiting _queue_task() for managed_node1/include_tasks 11044 1726853244.56401: done queuing things up, now waiting for results queue to drain 11044 1726853244.56402: waiting for pending results... 
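Each `_low_level_execute_command()` step in this trace shells out over the multiplexed SSH connection, and the first one in every module cycle creates a task-private remote temp directory with `umask 77 && mkdir -p ... && mkdir ...` so the transferred `AnsiballZ_*.py` payload is never group- or world-readable. A local Python sketch of that permission pattern (the paths below are hypothetical demo paths, not Ansible's real remote layout):

```python
import os
import stat
import tempfile

def make_private_tmpdir(base):
    """Create a task-private directory the way the logged shell command does:
    set a restrictive umask first, then mkdir, so the directory is never
    visible to group/other even briefly."""
    old_umask = os.umask(0o77)                    # shell: umask 77
    try:
        os.makedirs(base, exist_ok=True)          # shell: mkdir -p base
        return tempfile.mkdtemp(prefix="ansible-tmp-", dir=base)
    finally:
        os.umask(old_umask)                       # restore caller's umask

base = os.path.join(tempfile.gettempdir(), "demo-ansible-tmp")
tmpdir = make_private_tmpdir(base)
mode = stat.S_IMODE(os.lstat(tmpdir).st_mode)
print(oct(mode))  # prints 0o700
```

The `&& sleep 0` suffix seen on the real commands is just Ansible's way of normalizing exit-status handling across shells; it does not affect the directory creation.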
11044 1726853244.56790: running TaskExecutor() for managed_node1/TASK: Include the task 'get_interface_stat.yml' 11044 1726853244.56795: in run() - task 02083763-bbaf-c5a6-f857-00000000001b 11044 1726853244.56798: variable 'ansible_search_path' from source: unknown 11044 1726853244.56801: variable 'ansible_search_path' from source: unknown 11044 1726853244.56823: calling self._execute() 11044 1726853244.56911: variable 'ansible_host' from source: host vars for 'managed_node1' 11044 1726853244.56923: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11044 1726853244.56936: variable 'omit' from source: magic vars 11044 1726853244.57363: variable 'ansible_distribution_major_version' from source: facts 11044 1726853244.57381: Evaluated conditional (ansible_distribution_major_version != '6'): True 11044 1726853244.57392: _execute() done 11044 1726853244.57401: dumping result to json 11044 1726853244.57409: done dumping result, returning 11044 1726853244.57419: done running TaskExecutor() for managed_node1/TASK: Include the task 'get_interface_stat.yml' [02083763-bbaf-c5a6-f857-00000000001b] 11044 1726853244.57434: sending task result for task 02083763-bbaf-c5a6-f857-00000000001b 11044 1726853244.57537: done sending task result for task 02083763-bbaf-c5a6-f857-00000000001b 11044 1726853244.57676: WORKER PROCESS EXITING 11044 1726853244.57702: no more pending results, returning what we have 11044 1726853244.57706: in VariableManager get_vars() 11044 1726853244.57757: Calling all_inventory to load vars for managed_node1 11044 1726853244.57760: Calling groups_inventory to load vars for managed_node1 11044 1726853244.57762: Calling all_plugins_inventory to load vars for managed_node1 11044 1726853244.57776: Calling all_plugins_play to load vars for managed_node1 11044 1726853244.57779: Calling groups_plugins_inventory to load vars for managed_node1 11044 1726853244.57782: Calling groups_plugins_play to load vars for managed_node1 11044 
1726853244.58063: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11044 1726853244.58250: done with get_vars() 11044 1726853244.58256: variable 'ansible_search_path' from source: unknown 11044 1726853244.58257: variable 'ansible_search_path' from source: unknown 11044 1726853244.58291: we have included files to process 11044 1726853244.58293: generating all_blocks data 11044 1726853244.58294: done generating all_blocks data 11044 1726853244.58298: processing included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 11044 1726853244.58299: loading included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 11044 1726853244.58301: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 11044 1726853244.58469: done processing included file 11044 1726853244.58473: iterating over new_blocks loaded from include file 11044 1726853244.58474: in VariableManager get_vars() 11044 1726853244.58492: done with get_vars() 11044 1726853244.58494: filtering new block on tags 11044 1726853244.58509: done filtering new block on tags 11044 1726853244.58512: done iterating over new_blocks loaded from include file included: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml for managed_node1 11044 1726853244.58517: extending task lists for all hosts with included blocks 11044 1726853244.58616: done extending task lists 11044 1726853244.58617: done processing included files 11044 1726853244.58618: results queue empty 11044 1726853244.58618: checking for any_errors_fatal 11044 1726853244.58620: done checking for any_errors_fatal 11044 1726853244.58622: checking for max_fail_percentage 11044 1726853244.58623: done checking for 
max_fail_percentage 11044 1726853244.58624: checking to see if all hosts have failed and the running result is not ok 11044 1726853244.58624: done checking to see if all hosts have failed 11044 1726853244.58625: getting the remaining hosts for this loop 11044 1726853244.58626: done getting the remaining hosts for this loop 11044 1726853244.58628: getting the next task for host managed_node1 11044 1726853244.58632: done getting next task for host managed_node1 11044 1726853244.58634: ^ task is: TASK: Get stat for interface {{ interface }} 11044 1726853244.58637: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11044 1726853244.58639: getting variables 11044 1726853244.58640: in VariableManager get_vars() 11044 1726853244.58654: Calling all_inventory to load vars for managed_node1 11044 1726853244.58657: Calling groups_inventory to load vars for managed_node1 11044 1726853244.58659: Calling all_plugins_inventory to load vars for managed_node1 11044 1726853244.58664: Calling all_plugins_play to load vars for managed_node1 11044 1726853244.58666: Calling groups_plugins_inventory to load vars for managed_node1 11044 1726853244.58669: Calling groups_plugins_play to load vars for managed_node1 11044 1726853244.58794: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11044 1726853244.59207: done with get_vars() 11044 1726853244.59216: done getting variables 11044 1726853244.59365: variable 'interface' from source: task vars 11044 1726853244.59370: variable 'dhcp_interface2' from source: play vars 11044 1726853244.59431: variable 'dhcp_interface2' from source: play vars TASK [Get stat for interface test2] ******************************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml:3 Friday 20 September 2024 13:27:24 -0400 (0:00:00.033) 0:00:08.970 ****** 11044 1726853244.59463: entering _queue_task() for managed_node1/stat 11044 1726853244.59726: worker is 1 (out of 1 available) 11044 1726853244.59738: exiting _queue_task() for managed_node1/stat 11044 1726853244.59753: done queuing things up, now waiting for results queue to drain 11044 1726853244.59754: waiting for pending results... 
11044 1726853244.60020: running TaskExecutor() for managed_node1/TASK: Get stat for interface test2 11044 1726853244.60160: in run() - task 02083763-bbaf-c5a6-f857-00000000016b 11044 1726853244.60184: variable 'ansible_search_path' from source: unknown 11044 1726853244.60197: variable 'ansible_search_path' from source: unknown 11044 1726853244.60237: calling self._execute() 11044 1726853244.60334: variable 'ansible_host' from source: host vars for 'managed_node1' 11044 1726853244.60352: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11044 1726853244.60367: variable 'omit' from source: magic vars 11044 1726853244.60976: variable 'ansible_distribution_major_version' from source: facts 11044 1726853244.60980: Evaluated conditional (ansible_distribution_major_version != '6'): True 11044 1726853244.60983: variable 'omit' from source: magic vars 11044 1726853244.60985: variable 'omit' from source: magic vars 11044 1726853244.60988: variable 'interface' from source: task vars 11044 1726853244.60990: variable 'dhcp_interface2' from source: play vars 11044 1726853244.60992: variable 'dhcp_interface2' from source: play vars 11044 1726853244.61012: variable 'omit' from source: magic vars 11044 1726853244.61060: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11044 1726853244.61105: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11044 1726853244.61133: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11044 1726853244.61160: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11044 1726853244.61178: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11044 1726853244.61208: variable 'inventory_hostname' from source: host vars for 
'managed_node1' 11044 1726853244.61220: variable 'ansible_host' from source: host vars for 'managed_node1' 11044 1726853244.61228: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11044 1726853244.61320: Set connection var ansible_timeout to 10 11044 1726853244.61337: Set connection var ansible_shell_executable to /bin/sh 11044 1726853244.61343: Set connection var ansible_shell_type to sh 11044 1726853244.61354: Set connection var ansible_module_compression to ZIP_DEFLATED 11044 1726853244.61361: Set connection var ansible_connection to ssh 11044 1726853244.61437: Set connection var ansible_pipelining to False 11044 1726853244.61440: variable 'ansible_shell_executable' from source: unknown 11044 1726853244.61442: variable 'ansible_connection' from source: unknown 11044 1726853244.61446: variable 'ansible_module_compression' from source: unknown 11044 1726853244.61448: variable 'ansible_shell_type' from source: unknown 11044 1726853244.61450: variable 'ansible_shell_executable' from source: unknown 11044 1726853244.61452: variable 'ansible_host' from source: host vars for 'managed_node1' 11044 1726853244.61454: variable 'ansible_pipelining' from source: unknown 11044 1726853244.61456: variable 'ansible_timeout' from source: unknown 11044 1726853244.61458: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11044 1726853244.61615: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 11044 1726853244.61629: variable 'omit' from source: magic vars 11044 1726853244.61638: starting attempt loop 11044 1726853244.61643: running the handler 11044 1726853244.61666: _low_level_execute_command(): starting 11044 1726853244.61678: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11044 
1726853244.62347: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11044 1726853244.62356: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11044 1726853244.62413: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11044 1726853244.62430: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11044 1726853244.62491: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 11044 1726853244.62508: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11044 1726853244.62519: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11044 1726853244.62599: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11044 1726853244.64284: stdout chunk (state=3): >>>/root <<< 11044 1726853244.64427: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11044 1726853244.64440: stdout chunk (state=3): >>><<< 11044 1726853244.64458: stderr chunk (state=3): >>><<< 11044 1726853244.64490: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11044 1726853244.64517: _low_level_execute_command(): starting 11044 1726853244.64530: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853244.6450315-11567-180471106454763 `" && echo ansible-tmp-1726853244.6450315-11567-180471106454763="` echo /root/.ansible/tmp/ansible-tmp-1726853244.6450315-11567-180471106454763 `" ) && sleep 0' 11044 1726853244.65195: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11044 1726853244.65211: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11044 1726853244.65227: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11044 1726853244.65259: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11044 
1726853244.65312: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11044 1726853244.65381: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 11044 1726853244.65406: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11044 1726853244.65426: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11044 1726853244.65489: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11044 1726853244.67412: stdout chunk (state=3): >>>ansible-tmp-1726853244.6450315-11567-180471106454763=/root/.ansible/tmp/ansible-tmp-1726853244.6450315-11567-180471106454763 <<< 11044 1726853244.67554: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11044 1726853244.67591: stderr chunk (state=3): >>><<< 11044 1726853244.67595: stdout chunk (state=3): >>><<< 11044 1726853244.67614: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853244.6450315-11567-180471106454763=/root/.ansible/tmp/ansible-tmp-1726853244.6450315-11567-180471106454763 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: 
checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11044 1726853244.67776: variable 'ansible_module_compression' from source: unknown 11044 1726853244.67781: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-1104467doc9gy/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 11044 1726853244.67784: variable 'ansible_facts' from source: unknown 11044 1726853244.67886: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853244.6450315-11567-180471106454763/AnsiballZ_stat.py 11044 1726853244.68030: Sending initial data 11044 1726853244.68152: Sent initial data (153 bytes) 11044 1726853244.68686: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11044 1726853244.68705: stderr chunk 
(state=3): >>>debug1: configuration requests final Match pass <<< 11044 1726853244.68788: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11044 1726853244.68809: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 11044 1726853244.68830: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11044 1726853244.68856: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11044 1726853244.68934: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11044 1726853244.70475: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 11044 1726853244.70700: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 11044 1726853244.70742: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-1104467doc9gy/tmpnbxwjyys /root/.ansible/tmp/ansible-tmp-1726853244.6450315-11567-180471106454763/AnsiballZ_stat.py <<< 11044 1726853244.70745: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853244.6450315-11567-180471106454763/AnsiballZ_stat.py" <<< 11044 1726853244.70799: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-1104467doc9gy/tmpnbxwjyys" to remote "/root/.ansible/tmp/ansible-tmp-1726853244.6450315-11567-180471106454763/AnsiballZ_stat.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853244.6450315-11567-180471106454763/AnsiballZ_stat.py" <<< 11044 1726853244.71533: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11044 1726853244.71615: stderr chunk (state=3): >>><<< 11044 1726853244.71624: stdout chunk (state=3): >>><<< 11044 1726853244.71653: done transferring module to remote 11044 1726853244.71679: _low_level_execute_command(): starting 11044 1726853244.71689: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853244.6450315-11567-180471106454763/ /root/.ansible/tmp/ansible-tmp-1726853244.6450315-11567-180471106454763/AnsiballZ_stat.py && sleep 0' 11044 1726853244.72365: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11044 1726853244.72383: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11044 1726853244.72400: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11044 1726853244.72422: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11044 1726853244.72449: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 <<< 11044 
1726853244.72463: stderr chunk (state=3): >>>debug2: match not found <<< 11044 1726853244.72557: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11044 1726853244.72589: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK <<< 11044 1726853244.72612: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11044 1726853244.72679: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11044 1726853244.74499: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11044 1726853244.74513: stdout chunk (state=3): >>><<< 11044 1726853244.74531: stderr chunk (state=3): >>><<< 11044 1726853244.74557: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: 
Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11044 1726853244.74566: _low_level_execute_command(): starting 11044 1726853244.74579: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853244.6450315-11567-180471106454763/AnsiballZ_stat.py && sleep 0' 11044 1726853244.75451: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11044 1726853244.75467: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11044 1726853244.75522: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11044 1726853244.75589: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at 
'/root/.ansible/cp/a2da574bb2' <<< 11044 1726853244.75605: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11044 1726853244.75634: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11044 1726853244.75791: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11044 1726853244.91010: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": true, "path": "/sys/class/net/test2", "mode": "0777", "isdir": false, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": true, "issock": false, "uid": 0, "gid": 0, "size": 0, "inode": 26750, "dev": 23, "nlink": 1, "atime": 1726853242.8346274, "mtime": 1726853242.8346274, "ctime": 1726853242.8346274, "wusr": true, "rusr": true, "xusr": true, "wgrp": true, "rgrp": true, "xgrp": true, "woth": true, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "lnk_source": "/sys/devices/virtual/net/test2", "lnk_target": "../../devices/virtual/net/test2", "pw_name": "root", "gr_name": "root"}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/test2", "follow": false, "checksum_algorithm": "sha1"}}} <<< 11044 1726853244.92419: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.45.153 closed. 
<<< 11044 1726853244.92423: stdout chunk (state=3): >>><<< 11044 1726853244.92426: stderr chunk (state=3): >>><<< 11044 1726853244.92577: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": true, "path": "/sys/class/net/test2", "mode": "0777", "isdir": false, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": true, "issock": false, "uid": 0, "gid": 0, "size": 0, "inode": 26750, "dev": 23, "nlink": 1, "atime": 1726853242.8346274, "mtime": 1726853242.8346274, "ctime": 1726853242.8346274, "wusr": true, "rusr": true, "xusr": true, "wgrp": true, "rgrp": true, "xgrp": true, "woth": true, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "lnk_source": "/sys/devices/virtual/net/test2", "lnk_target": "../../devices/virtual/net/test2", "pw_name": "root", "gr_name": "root"}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/test2", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration 
data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.45.153 closed. 11044 1726853244.92581: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/sys/class/net/test2', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853244.6450315-11567-180471106454763/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11044 1726853244.92588: _low_level_execute_command(): starting 11044 1726853244.92591: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853244.6450315-11567-180471106454763/ > /dev/null 2>&1 && sleep 0' 11044 1726853244.93208: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11044 1726853244.93226: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11044 1726853244.93242: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11044 1726853244.93276: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11044 1726853244.93290: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.153 originally 
10.31.45.153 <<< 11044 1726853244.93375: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11044 1726853244.93403: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 11044 1726853244.93429: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11044 1726853244.93442: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11044 1726853244.93604: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11044 1726853244.95579: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11044 1726853244.95583: stdout chunk (state=3): >>><<< 11044 1726853244.95585: stderr chunk (state=3): >>><<< 11044 1726853244.95588: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11044 1726853244.95590: handler run complete 11044 1726853244.95592: attempt loop complete, returning result 11044 1726853244.95594: _execute() done 11044 1726853244.95596: dumping result to json 11044 1726853244.95597: done dumping result, returning 11044 1726853244.95621: done running TaskExecutor() for managed_node1/TASK: Get stat for interface test2 [02083763-bbaf-c5a6-f857-00000000016b] 11044 1726853244.95630: sending task result for task 02083763-bbaf-c5a6-f857-00000000016b 11044 1726853244.95796: done sending task result for task 02083763-bbaf-c5a6-f857-00000000016b 11044 1726853244.95800: WORKER PROCESS EXITING ok: [managed_node1] => { "changed": false, "stat": { "atime": 1726853242.8346274, "block_size": 4096, "blocks": 0, "ctime": 1726853242.8346274, "dev": 23, "device_type": 0, "executable": true, "exists": true, "gid": 0, "gr_name": "root", "inode": 26750, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": true, "isreg": false, "issock": false, "isuid": false, "lnk_source": "/sys/devices/virtual/net/test2", "lnk_target": "../../devices/virtual/net/test2", "mode": "0777", "mtime": 1726853242.8346274, "nlink": 1, "path": "/sys/class/net/test2", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 0, "uid": 0, "wgrp": true, "woth": true, "writeable": true, "wusr": true, "xgrp": true, "xoth": true, "xusr": true 
} } 11044 1726853244.95914: no more pending results, returning what we have 11044 1726853244.95918: results queue empty 11044 1726853244.95919: checking for any_errors_fatal 11044 1726853244.95921: done checking for any_errors_fatal 11044 1726853244.95922: checking for max_fail_percentage 11044 1726853244.95923: done checking for max_fail_percentage 11044 1726853244.95924: checking to see if all hosts have failed and the running result is not ok 11044 1726853244.95925: done checking to see if all hosts have failed 11044 1726853244.95926: getting the remaining hosts for this loop 11044 1726853244.95927: done getting the remaining hosts for this loop 11044 1726853244.95930: getting the next task for host managed_node1 11044 1726853244.95939: done getting next task for host managed_node1 11044 1726853244.95941: ^ task is: TASK: Assert that the interface is present - '{{ interface }}' 11044 1726853244.95944: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11044 1726853244.95949: getting variables 11044 1726853244.95951: in VariableManager get_vars() 11044 1726853244.96104: Calling all_inventory to load vars for managed_node1 11044 1726853244.96107: Calling groups_inventory to load vars for managed_node1 11044 1726853244.96110: Calling all_plugins_inventory to load vars for managed_node1 11044 1726853244.96123: Calling all_plugins_play to load vars for managed_node1 11044 1726853244.96126: Calling groups_plugins_inventory to load vars for managed_node1 11044 1726853244.96129: Calling groups_plugins_play to load vars for managed_node1 11044 1726853244.96488: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11044 1726853244.96702: done with get_vars() 11044 1726853244.96713: done getting variables 11044 1726853244.96775: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 11044 1726853244.96895: variable 'interface' from source: task vars 11044 1726853244.96898: variable 'dhcp_interface2' from source: play vars 11044 1726853244.96949: variable 'dhcp_interface2' from source: play vars TASK [Assert that the interface is present - 'test2'] ************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml:5 Friday 20 September 2024 13:27:24 -0400 (0:00:00.375) 0:00:09.345 ****** 11044 1726853244.96990: entering _queue_task() for managed_node1/assert 11044 1726853244.97396: worker is 1 (out of 1 available) 11044 1726853244.97407: exiting _queue_task() for managed_node1/assert 11044 1726853244.97417: done queuing things up, now waiting for results queue to drain 11044 1726853244.97418: waiting for pending results... 
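The stat task whose result appears above reports the interface present because /sys/class/net/test2 exists as a symlink into /sys/devices/virtual/net/test2. A minimal Python sketch of that check (illustrative only; the function name and the sysfs_root parameter are hypothetical, not part of Ansible's stat module):

```python
import os

def interface_present(name, sysfs_root="/sys/class/net"):
    """Approximate the 'Get stat for interface' task above: a network
    interface exists when /sys/class/net/<name> is present, normally as
    a symlink into /sys/devices/... (see islnk/lnk_target in the log)."""
    path = os.path.join(sysfs_root, name)
    return {
        # lexists() does not follow the link, so it is true for the
        # symlink itself, matching the exists=true in the stat result.
        "exists": os.path.lexists(path),
        "islnk": os.path.islink(path),
        "lnk_target": os.readlink(path) if os.path.islink(path) else None,
    }
```

Against a real host, interface_present("test2") would mirror the exists/islnk fields shown in the task result above.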
11044 1726853244.97627: running TaskExecutor() for managed_node1/TASK: Assert that the interface is present - 'test2' 11044 1726853244.97756: in run() - task 02083763-bbaf-c5a6-f857-00000000001c 11044 1726853244.97759: variable 'ansible_search_path' from source: unknown 11044 1726853244.97762: variable 'ansible_search_path' from source: unknown 11044 1726853244.97796: calling self._execute() 11044 1726853244.97941: variable 'ansible_host' from source: host vars for 'managed_node1' 11044 1726853244.97945: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11044 1726853244.97948: variable 'omit' from source: magic vars 11044 1726853244.98507: variable 'ansible_distribution_major_version' from source: facts 11044 1726853244.98532: Evaluated conditional (ansible_distribution_major_version != '6'): True 11044 1726853244.98546: variable 'omit' from source: magic vars 11044 1726853244.98598: variable 'omit' from source: magic vars 11044 1726853244.98701: variable 'interface' from source: task vars 11044 1726853244.98712: variable 'dhcp_interface2' from source: play vars 11044 1726853244.98782: variable 'dhcp_interface2' from source: play vars 11044 1726853244.98805: variable 'omit' from source: magic vars 11044 1726853244.98851: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11044 1726853244.98896: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11044 1726853244.98921: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11044 1726853244.98946: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11044 1726853244.98964: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11044 1726853244.99001: variable 'inventory_hostname' from source: host 
vars for 'managed_node1' 11044 1726853244.99009: variable 'ansible_host' from source: host vars for 'managed_node1' 11044 1726853244.99018: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11044 1726853244.99125: Set connection var ansible_timeout to 10 11044 1726853244.99139: Set connection var ansible_shell_executable to /bin/sh 11044 1726853244.99148: Set connection var ansible_shell_type to sh 11044 1726853244.99160: Set connection var ansible_module_compression to ZIP_DEFLATED 11044 1726853244.99169: Set connection var ansible_connection to ssh 11044 1726853244.99179: Set connection var ansible_pipelining to False 11044 1726853244.99205: variable 'ansible_shell_executable' from source: unknown 11044 1726853244.99211: variable 'ansible_connection' from source: unknown 11044 1726853244.99216: variable 'ansible_module_compression' from source: unknown 11044 1726853244.99276: variable 'ansible_shell_type' from source: unknown 11044 1726853244.99279: variable 'ansible_shell_executable' from source: unknown 11044 1726853244.99281: variable 'ansible_host' from source: host vars for 'managed_node1' 11044 1726853244.99283: variable 'ansible_pipelining' from source: unknown 11044 1726853244.99285: variable 'ansible_timeout' from source: unknown 11044 1726853244.99287: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11044 1726853244.99390: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11044 1726853244.99406: variable 'omit' from source: magic vars 11044 1726853244.99416: starting attempt loop 11044 1726853244.99424: running the handler 11044 1726853244.99573: variable 'interface_stat' from source: set_fact 11044 1726853244.99603: Evaluated conditional 
(interface_stat.stat.exists): True 11044 1726853244.99613: handler run complete 11044 1726853244.99691: attempt loop complete, returning result 11044 1726853244.99694: _execute() done 11044 1726853244.99696: dumping result to json 11044 1726853244.99699: done dumping result, returning 11044 1726853244.99701: done running TaskExecutor() for managed_node1/TASK: Assert that the interface is present - 'test2' [02083763-bbaf-c5a6-f857-00000000001c] 11044 1726853244.99703: sending task result for task 02083763-bbaf-c5a6-f857-00000000001c 11044 1726853244.99777: done sending task result for task 02083763-bbaf-c5a6-f857-00000000001c 11044 1726853244.99781: WORKER PROCESS EXITING ok: [managed_node1] => { "changed": false } MSG: All assertions passed 11044 1726853244.99850: no more pending results, returning what we have 11044 1726853244.99853: results queue empty 11044 1726853244.99854: checking for any_errors_fatal 11044 1726853244.99866: done checking for any_errors_fatal 11044 1726853244.99867: checking for max_fail_percentage 11044 1726853244.99869: done checking for max_fail_percentage 11044 1726853244.99870: checking to see if all hosts have failed and the running result is not ok 11044 1726853244.99873: done checking to see if all hosts have failed 11044 1726853244.99874: getting the remaining hosts for this loop 11044 1726853244.99875: done getting the remaining hosts for this loop 11044 1726853244.99879: getting the next task for host managed_node1 11044 1726853244.99887: done getting next task for host managed_node1 11044 1726853244.99890: ^ task is: TASK: Backup the /etc/resolv.conf for initscript 11044 1726853244.99892: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11044 1726853244.99895: getting variables 11044 1726853244.99897: in VariableManager get_vars() 11044 1726853244.99949: Calling all_inventory to load vars for managed_node1 11044 1726853244.99952: Calling groups_inventory to load vars for managed_node1 11044 1726853244.99955: Calling all_plugins_inventory to load vars for managed_node1 11044 1726853244.99969: Calling all_plugins_play to load vars for managed_node1 11044 1726853245.00076: Calling groups_plugins_inventory to load vars for managed_node1 11044 1726853245.00081: Calling groups_plugins_play to load vars for managed_node1 11044 1726853245.00514: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11044 1726853245.00717: done with get_vars() 11044 1726853245.00728: done getting variables 11044 1726853245.00796: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Backup the /etc/resolv.conf for initscript] ****************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond_deprecated.yml:28 Friday 20 September 2024 13:27:25 -0400 (0:00:00.038) 0:00:09.383 ****** 11044 1726853245.00824: entering _queue_task() for managed_node1/command 11044 1726853245.01128: worker is 1 (out of 1 available) 11044 1726853245.01140: exiting _queue_task() for managed_node1/command 11044 1726853245.01157: done queuing things up, now waiting for results queue to drain 11044 1726853245.01158: waiting for pending results... 
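The assert task above evaluates its conditional (interface_stat.stat.exists) and reports "All assertions passed". A rough sketch of that control flow (hedged: the real assert action renders Jinja2 expressions; plain Python callables stand in for them here, and run_assert is a hypothetical name):

```python
def run_assert(conditions, context):
    """Sketch of what the 'assert' action above does: evaluate each
    condition against the task's variables and fail on the first false
    one; otherwise report success with changed=false."""
    for cond in conditions:
        if not cond(context):
            # Real output also records which condition failed
            # (see false_condition in skipped/failed results).
            return {"failed": True, "msg": "Assertion failed"}
    return {"changed": False, "msg": "All assertions passed"}
```

With context {"interface_stat": {"stat": {"exists": True}}} and the condition from the log, this returns the same "All assertions passed" message shown above.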
11044 1726853245.01492: running TaskExecutor() for managed_node1/TASK: Backup the /etc/resolv.conf for initscript 11044 1726853245.01578: in run() - task 02083763-bbaf-c5a6-f857-00000000001d 11044 1726853245.01582: variable 'ansible_search_path' from source: unknown 11044 1726853245.01600: calling self._execute() 11044 1726853245.01697: variable 'ansible_host' from source: host vars for 'managed_node1' 11044 1726853245.01717: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11044 1726853245.01776: variable 'omit' from source: magic vars 11044 1726853245.02108: variable 'ansible_distribution_major_version' from source: facts 11044 1726853245.02124: Evaluated conditional (ansible_distribution_major_version != '6'): True 11044 1726853245.02242: variable 'network_provider' from source: set_fact 11044 1726853245.02256: Evaluated conditional (network_provider == "initscripts"): False 11044 1726853245.02265: when evaluation is False, skipping this task 11044 1726853245.02272: _execute() done 11044 1726853245.02279: dumping result to json 11044 1726853245.02287: done dumping result, returning 11044 1726853245.02296: done running TaskExecutor() for managed_node1/TASK: Backup the /etc/resolv.conf for initscript [02083763-bbaf-c5a6-f857-00000000001d] 11044 1726853245.02369: sending task result for task 02083763-bbaf-c5a6-f857-00000000001d 11044 1726853245.02435: done sending task result for task 02083763-bbaf-c5a6-f857-00000000001d 11044 1726853245.02439: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "network_provider == \"initscripts\"", "skip_reason": "Conditional result was False" } 11044 1726853245.02487: no more pending results, returning what we have 11044 1726853245.02491: results queue empty 11044 1726853245.02492: checking for any_errors_fatal 11044 1726853245.02499: done checking for any_errors_fatal 11044 1726853245.02500: checking for max_fail_percentage 11044 1726853245.02502: done checking 
for max_fail_percentage 11044 1726853245.02502: checking to see if all hosts have failed and the running result is not ok 11044 1726853245.02503: done checking to see if all hosts have failed 11044 1726853245.02504: getting the remaining hosts for this loop 11044 1726853245.02505: done getting the remaining hosts for this loop 11044 1726853245.02508: getting the next task for host managed_node1 11044 1726853245.02514: done getting next task for host managed_node1 11044 1726853245.02516: ^ task is: TASK: TEST Add Bond with 2 ports using deprecated 'master' argument 11044 1726853245.02519: ^ state is: HOST STATE: block=2, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11044 1726853245.02522: getting variables 11044 1726853245.02524: in VariableManager get_vars() 11044 1726853245.02567: Calling all_inventory to load vars for managed_node1 11044 1726853245.02570: Calling groups_inventory to load vars for managed_node1 11044 1726853245.02573: Calling all_plugins_inventory to load vars for managed_node1 11044 1726853245.02586: Calling all_plugins_play to load vars for managed_node1 11044 1726853245.02589: Calling groups_plugins_inventory to load vars for managed_node1 11044 1726853245.02592: Calling groups_plugins_play to load vars for managed_node1 11044 1726853245.02976: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11044 1726853245.03208: done with get_vars() 11044 1726853245.03220: done getting variables 11044 1726853245.03285: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 
(found_in_cache=True, class_only=True) TASK [TEST Add Bond with 2 ports using deprecated 'master' argument] *********** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond_deprecated.yml:33 Friday 20 September 2024 13:27:25 -0400 (0:00:00.024) 0:00:09.408 ****** 11044 1726853245.03312: entering _queue_task() for managed_node1/debug 11044 1726853245.03599: worker is 1 (out of 1 available) 11044 1726853245.03611: exiting _queue_task() for managed_node1/debug 11044 1726853245.03624: done queuing things up, now waiting for results queue to drain 11044 1726853245.03625: waiting for pending results... 11044 1726853245.03997: running TaskExecutor() for managed_node1/TASK: TEST Add Bond with 2 ports using deprecated 'master' argument 11044 1726853245.04002: in run() - task 02083763-bbaf-c5a6-f857-00000000001e 11044 1726853245.04005: variable 'ansible_search_path' from source: unknown 11044 1726853245.04043: calling self._execute() 11044 1726853245.04138: variable 'ansible_host' from source: host vars for 'managed_node1' 11044 1726853245.04154: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11044 1726853245.04168: variable 'omit' from source: magic vars 11044 1726853245.04622: variable 'ansible_distribution_major_version' from source: facts 11044 1726853245.04647: Evaluated conditional (ansible_distribution_major_version != '6'): True 11044 1726853245.04659: variable 'omit' from source: magic vars 11044 1726853245.04685: variable 'omit' from source: magic vars 11044 1726853245.04725: variable 'omit' from source: magic vars 11044 1726853245.04776: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11044 1726853245.04817: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11044 1726853245.04842: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11044 
1726853245.04870: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11044 1726853245.04889: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11044 1726853245.04960: variable 'inventory_hostname' from source: host vars for 'managed_node1' 11044 1726853245.04963: variable 'ansible_host' from source: host vars for 'managed_node1' 11044 1726853245.04965: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11044 1726853245.05047: Set connection var ansible_timeout to 10 11044 1726853245.05065: Set connection var ansible_shell_executable to /bin/sh 11044 1726853245.05078: Set connection var ansible_shell_type to sh 11044 1726853245.05090: Set connection var ansible_module_compression to ZIP_DEFLATED 11044 1726853245.05100: Set connection var ansible_connection to ssh 11044 1726853245.05177: Set connection var ansible_pipelining to False 11044 1726853245.05180: variable 'ansible_shell_executable' from source: unknown 11044 1726853245.05183: variable 'ansible_connection' from source: unknown 11044 1726853245.05185: variable 'ansible_module_compression' from source: unknown 11044 1726853245.05187: variable 'ansible_shell_type' from source: unknown 11044 1726853245.05190: variable 'ansible_shell_executable' from source: unknown 11044 1726853245.05191: variable 'ansible_host' from source: host vars for 'managed_node1' 11044 1726853245.05193: variable 'ansible_pipelining' from source: unknown 11044 1726853245.05196: variable 'ansible_timeout' from source: unknown 11044 1726853245.05198: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11044 1726853245.05328: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11044 1726853245.05343: variable 'omit' from source: magic vars 11044 1726853245.05357: starting attempt loop 11044 1726853245.05364: running the handler 11044 1726853245.05420: handler run complete 11044 1726853245.05443: attempt loop complete, returning result 11044 1726853245.05454: _execute() done 11044 1726853245.05475: dumping result to json 11044 1726853245.05478: done dumping result, returning 11044 1726853245.05480: done running TaskExecutor() for managed_node1/TASK: TEST Add Bond with 2 ports using deprecated 'master' argument [02083763-bbaf-c5a6-f857-00000000001e] 11044 1726853245.05485: sending task result for task 02083763-bbaf-c5a6-f857-00000000001e 11044 1726853245.05748: done sending task result for task 02083763-bbaf-c5a6-f857-00000000001e 11044 1726853245.05752: WORKER PROCESS EXITING ok: [managed_node1] => {} MSG: ################################################## 11044 1726853245.05802: no more pending results, returning what we have 11044 1726853245.05806: results queue empty 11044 1726853245.05807: checking for any_errors_fatal 11044 1726853245.05813: done checking for any_errors_fatal 11044 1726853245.05814: checking for max_fail_percentage 11044 1726853245.05816: done checking for max_fail_percentage 11044 1726853245.05817: checking to see if all hosts have failed and the running result is not ok 11044 1726853245.05817: done checking to see if all hosts have failed 11044 1726853245.05818: getting the remaining hosts for this loop 11044 1726853245.05819: done getting the remaining hosts for this loop 11044 1726853245.05824: getting the next task for host managed_node1 11044 1726853245.05831: done getting next task for host managed_node1 11044 1726853245.05837: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 11044 
1726853245.05840: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11044 1726853245.05861: getting variables 11044 1726853245.05863: in VariableManager get_vars() 11044 1726853245.05909: Calling all_inventory to load vars for managed_node1 11044 1726853245.05912: Calling groups_inventory to load vars for managed_node1 11044 1726853245.05914: Calling all_plugins_inventory to load vars for managed_node1 11044 1726853245.05924: Calling all_plugins_play to load vars for managed_node1 11044 1726853245.05927: Calling groups_plugins_inventory to load vars for managed_node1 11044 1726853245.05930: Calling groups_plugins_play to load vars for managed_node1 11044 1726853245.06292: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11044 1726853245.06497: done with get_vars() 11044 1726853245.06508: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4 Friday 20 September 2024 13:27:25 -0400 (0:00:00.032) 0:00:09.441 ****** 11044 1726853245.06603: entering _queue_task() for managed_node1/include_tasks 11044 1726853245.06899: worker is 1 (out of 1 available) 11044 1726853245.06914: exiting _queue_task() for managed_node1/include_tasks 11044 1726853245.06926: done queuing things up, now waiting for results queue to drain 11044 1726853245.06927: 
waiting for pending results... 11044 1726853245.07221: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 11044 1726853245.07396: in run() - task 02083763-bbaf-c5a6-f857-000000000026 11044 1726853245.07400: variable 'ansible_search_path' from source: unknown 11044 1726853245.07402: variable 'ansible_search_path' from source: unknown 11044 1726853245.07434: calling self._execute() 11044 1726853245.07534: variable 'ansible_host' from source: host vars for 'managed_node1' 11044 1726853245.07776: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11044 1726853245.07780: variable 'omit' from source: magic vars 11044 1726853245.07951: variable 'ansible_distribution_major_version' from source: facts 11044 1726853245.07969: Evaluated conditional (ansible_distribution_major_version != '6'): True 11044 1726853245.07982: _execute() done 11044 1726853245.07991: dumping result to json 11044 1726853245.08000: done dumping result, returning 11044 1726853245.08014: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role [02083763-bbaf-c5a6-f857-000000000026] 11044 1726853245.08115: sending task result for task 02083763-bbaf-c5a6-f857-000000000026 11044 1726853245.08197: done sending task result for task 02083763-bbaf-c5a6-f857-000000000026 11044 1726853245.08201: WORKER PROCESS EXITING 11044 1726853245.08265: no more pending results, returning what we have 11044 1726853245.08272: in VariableManager get_vars() 11044 1726853245.08325: Calling all_inventory to load vars for managed_node1 11044 1726853245.08328: Calling groups_inventory to load vars for managed_node1 11044 1726853245.08330: Calling all_plugins_inventory to load vars for managed_node1 11044 1726853245.08347: Calling all_plugins_play to load vars for managed_node1 11044 1726853245.08350: Calling groups_plugins_inventory to load vars for managed_node1 11044 
1726853245.08354: Calling groups_plugins_play to load vars for managed_node1 11044 1726853245.08562: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11044 1726853245.09029: done with get_vars() 11044 1726853245.09039: variable 'ansible_search_path' from source: unknown 11044 1726853245.09041: variable 'ansible_search_path' from source: unknown 11044 1726853245.09085: we have included files to process 11044 1726853245.09087: generating all_blocks data 11044 1726853245.09089: done generating all_blocks data 11044 1726853245.09094: processing included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 11044 1726853245.09095: loading included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 11044 1726853245.09097: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 11044 1726853245.09734: done processing included file 11044 1726853245.09736: iterating over new_blocks loaded from include file 11044 1726853245.09737: in VariableManager get_vars() 11044 1726853245.09763: done with get_vars() 11044 1726853245.09765: filtering new block on tags 11044 1726853245.09782: done filtering new block on tags 11044 1726853245.09785: in VariableManager get_vars() 11044 1726853245.09806: done with get_vars() 11044 1726853245.09807: filtering new block on tags 11044 1726853245.09825: done filtering new block on tags 11044 1726853245.09827: in VariableManager get_vars() 11044 1726853245.09849: done with get_vars() 11044 1726853245.09851: filtering new block on tags 11044 1726853245.09866: done filtering new block on tags 11044 1726853245.09867: done iterating over new_blocks loaded from include file included: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml for managed_node1 11044 1726853245.09873: 
extending task lists for all hosts with included blocks 11044 1726853245.10703: done extending task lists 11044 1726853245.10705: done processing included files 11044 1726853245.10705: results queue empty 11044 1726853245.10706: checking for any_errors_fatal 11044 1726853245.10710: done checking for any_errors_fatal 11044 1726853245.10711: checking for max_fail_percentage 11044 1726853245.10712: done checking for max_fail_percentage 11044 1726853245.10712: checking to see if all hosts have failed and the running result is not ok 11044 1726853245.10713: done checking to see if all hosts have failed 11044 1726853245.10714: getting the remaining hosts for this loop 11044 1726853245.10715: done getting the remaining hosts for this loop 11044 1726853245.10717: getting the next task for host managed_node1 11044 1726853245.10721: done getting next task for host managed_node1 11044 1726853245.10724: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 11044 1726853245.10727: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11044 1726853245.10736: getting variables 11044 1726853245.10737: in VariableManager get_vars() 11044 1726853245.10757: Calling all_inventory to load vars for managed_node1 11044 1726853245.10760: Calling groups_inventory to load vars for managed_node1 11044 1726853245.10762: Calling all_plugins_inventory to load vars for managed_node1 11044 1726853245.10768: Calling all_plugins_play to load vars for managed_node1 11044 1726853245.10772: Calling groups_plugins_inventory to load vars for managed_node1 11044 1726853245.10775: Calling groups_plugins_play to load vars for managed_node1 11044 1726853245.10942: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11044 1726853245.11136: done with get_vars() 11044 1726853245.11148: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role are present] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:3 Friday 20 September 2024 13:27:25 -0400 (0:00:00.046) 0:00:09.487 ****** 11044 1726853245.11220: entering _queue_task() for managed_node1/setup 11044 1726853245.11549: worker is 1 (out of 1 available) 11044 1726853245.11561: exiting _queue_task() for managed_node1/setup 11044 1726853245.11774: done queuing things up, now waiting for results queue to drain 11044 1726853245.11776: waiting for pending results... 
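The include sequence above loads set_facts.yml and repeatedly logs "filtering new block on tags". A simplified sketch of that filtering step (assumptions: filter_on_tags and the task-dict shape are illustrative, not Ansible internals; the real logic also handles 'never', inherited tags, and tagged blocks):

```python
def filter_on_tags(tasks, only_tags=None, skip_tags=None):
    """Keep tasks whose tags survive --tags/--skip-tags filtering.
    The special 'always' tag keeps a task unless explicitly skipped
    by other means (simplified here)."""
    kept = []
    for task in tasks:
        tags = set(task.get("tags", []))
        # --skip-tags drops matching tasks, but 'always' protects them.
        if skip_tags and tags & set(skip_tags) and "always" not in tags:
            continue
        # --tags keeps only matching tasks, plus anything tagged 'always'.
        if only_tags and not (tags & set(only_tags) or "always" in tags):
            continue
        kept.append(task)
    return kept
```

With only_tags=["network"], a task tagged ["setup"] is dropped while ones tagged ["network"] or ["always"] survive, which is the shape of the filtering the log records for each included block.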
11044 1726853245.11904: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 11044 1726853245.11988: in run() - task 02083763-bbaf-c5a6-f857-000000000189 11044 1726853245.12014: variable 'ansible_search_path' from source: unknown 11044 1726853245.12022: variable 'ansible_search_path' from source: unknown 11044 1726853245.12066: calling self._execute() 11044 1726853245.12156: variable 'ansible_host' from source: host vars for 'managed_node1' 11044 1726853245.12170: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11044 1726853245.12188: variable 'omit' from source: magic vars 11044 1726853245.12654: variable 'ansible_distribution_major_version' from source: facts 11044 1726853245.12658: Evaluated conditional (ansible_distribution_major_version != '6'): True 11044 1726853245.12827: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 11044 1726853245.15028: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 11044 1726853245.15102: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 11044 1726853245.15138: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 11044 1726853245.15183: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 11044 1726853245.15222: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 11044 1726853245.15303: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11044 1726853245.15332: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' 
from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11044 1726853245.15367: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11044 1726853245.15417: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11044 1726853245.15440: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11044 1726853245.15676: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11044 1726853245.15680: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11044 1726853245.15683: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11044 1726853245.15686: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11044 1726853245.15688: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11044 1726853245.15779: variable '__network_required_facts' from source: role 
'' defaults 11044 1726853245.15794: variable 'ansible_facts' from source: unknown 11044 1726853245.15892: Evaluated conditional (__network_required_facts | difference(ansible_facts.keys() | list) | length > 0): False 11044 1726853245.15901: when evaluation is False, skipping this task 11044 1726853245.15914: _execute() done 11044 1726853245.15922: dumping result to json 11044 1726853245.15931: done dumping result, returning 11044 1726853245.15947: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present [02083763-bbaf-c5a6-f857-000000000189] 11044 1726853245.15958: sending task result for task 02083763-bbaf-c5a6-f857-000000000189 11044 1726853245.16176: done sending task result for task 02083763-bbaf-c5a6-f857-000000000189 11044 1726853245.16179: WORKER PROCESS EXITING skipping: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 11044 1726853245.16225: no more pending results, returning what we have 11044 1726853245.16229: results queue empty 11044 1726853245.16230: checking for any_errors_fatal 11044 1726853245.16232: done checking for any_errors_fatal 11044 1726853245.16232: checking for max_fail_percentage 11044 1726853245.16234: done checking for max_fail_percentage 11044 1726853245.16235: checking to see if all hosts have failed and the running result is not ok 11044 1726853245.16236: done checking to see if all hosts have failed 11044 1726853245.16237: getting the remaining hosts for this loop 11044 1726853245.16238: done getting the remaining hosts for this loop 11044 1726853245.16241: getting the next task for host managed_node1 11044 1726853245.16253: done getting next task for host managed_node1 11044 1726853245.16257: ^ task is: TASK: fedora.linux_system_roles.network : Check if system is ostree 11044 1726853245.16262: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, 
handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11044 1726853245.16277: getting variables 11044 1726853245.16279: in VariableManager get_vars() 11044 1726853245.16323: Calling all_inventory to load vars for managed_node1 11044 1726853245.16327: Calling groups_inventory to load vars for managed_node1 11044 1726853245.16329: Calling all_plugins_inventory to load vars for managed_node1 11044 1726853245.16341: Calling all_plugins_play to load vars for managed_node1 11044 1726853245.16347: Calling groups_plugins_inventory to load vars for managed_node1 11044 1726853245.16351: Calling groups_plugins_play to load vars for managed_node1 11044 1726853245.16741: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11044 1726853245.16962: done with get_vars() 11044 1726853245.16974: done getting variables TASK [fedora.linux_system_roles.network : Check if system is ostree] *********** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:12 Friday 20 September 2024 13:27:25 -0400 (0:00:00.058) 0:00:09.546 ****** 11044 1726853245.17075: entering _queue_task() for managed_node1/stat 11044 1726853245.17347: worker is 1 (out of 1 
available) 11044 1726853245.17361: exiting _queue_task() for managed_node1/stat 11044 1726853245.17578: done queuing things up, now waiting for results queue to drain 11044 1726853245.17580: waiting for pending results... 11044 1726853245.17707: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if system is ostree 11044 1726853245.17792: in run() - task 02083763-bbaf-c5a6-f857-00000000018b 11044 1726853245.17818: variable 'ansible_search_path' from source: unknown 11044 1726853245.17828: variable 'ansible_search_path' from source: unknown 11044 1726853245.17875: calling self._execute() 11044 1726853245.17978: variable 'ansible_host' from source: host vars for 'managed_node1' 11044 1726853245.17991: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11044 1726853245.18005: variable 'omit' from source: magic vars 11044 1726853245.18401: variable 'ansible_distribution_major_version' from source: facts 11044 1726853245.18419: Evaluated conditional (ansible_distribution_major_version != '6'): True 11044 1726853245.18677: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 11044 1726853245.18902: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 11044 1726853245.18952: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 11044 1726853245.18994: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 11044 1726853245.19037: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 11044 1726853245.19222: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 11044 1726853245.19225: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 11044 1726853245.19227: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 11044 1726853245.19229: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 11044 1726853245.19316: variable '__network_is_ostree' from source: set_fact 11044 1726853245.19333: Evaluated conditional (not __network_is_ostree is defined): False 11044 1726853245.19342: when evaluation is False, skipping this task 11044 1726853245.19352: _execute() done 11044 1726853245.19359: dumping result to json 11044 1726853245.19367: done dumping result, returning 11044 1726853245.19381: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if system is ostree [02083763-bbaf-c5a6-f857-00000000018b] 11044 1726853245.19390: sending task result for task 02083763-bbaf-c5a6-f857-00000000018b skipping: [managed_node1] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 11044 1726853245.19551: no more pending results, returning what we have 11044 1726853245.19554: results queue empty 11044 1726853245.19556: checking for any_errors_fatal 11044 1726853245.19564: done checking for any_errors_fatal 11044 1726853245.19565: checking for max_fail_percentage 11044 1726853245.19567: done checking for max_fail_percentage 11044 1726853245.19568: checking to see if all hosts have failed and the running result is not ok 11044 1726853245.19569: done checking to see if all hosts have failed 11044 1726853245.19570: getting the remaining hosts for this loop 11044 
1726853245.19574: done getting the remaining hosts for this loop 11044 1726853245.19577: getting the next task for host managed_node1 11044 1726853245.19584: done getting next task for host managed_node1 11044 1726853245.19588: ^ task is: TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 11044 1726853245.19593: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11044 1726853245.19607: getting variables 11044 1726853245.19609: in VariableManager get_vars() 11044 1726853245.19656: Calling all_inventory to load vars for managed_node1 11044 1726853245.19660: Calling groups_inventory to load vars for managed_node1 11044 1726853245.19663: Calling all_plugins_inventory to load vars for managed_node1 11044 1726853245.19977: Calling all_plugins_play to load vars for managed_node1 11044 1726853245.19981: Calling groups_plugins_inventory to load vars for managed_node1 11044 1726853245.19988: done sending task result for task 02083763-bbaf-c5a6-f857-00000000018b 11044 1726853245.19990: WORKER PROCESS EXITING 11044 1726853245.19995: Calling groups_plugins_play to load vars for managed_node1 11044 1726853245.20168: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11044 1726853245.20380: done with get_vars() 11044 1726853245.20390: done getting variables 11044 1726853245.20440: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Set flag to indicate system is ostree] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:17 Friday 20 September 2024 13:27:25 -0400 (0:00:00.034) 0:00:09.580 ****** 11044 1726853245.20475: entering _queue_task() for managed_node1/set_fact 11044 1726853245.20735: worker is 1 (out of 1 available) 11044 1726853245.20748: exiting _queue_task() for managed_node1/set_fact 11044 1726853245.20760: done queuing things up, now waiting for results queue to drain 11044 1726853245.20762: waiting for pending results... 
11044 1726853245.21188: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 11044 1726853245.21193: in run() - task 02083763-bbaf-c5a6-f857-00000000018c 11044 1726853245.21196: variable 'ansible_search_path' from source: unknown 11044 1726853245.21199: variable 'ansible_search_path' from source: unknown 11044 1726853245.21239: calling self._execute() 11044 1726853245.21325: variable 'ansible_host' from source: host vars for 'managed_node1' 11044 1726853245.21337: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11044 1726853245.21355: variable 'omit' from source: magic vars 11044 1726853245.21727: variable 'ansible_distribution_major_version' from source: facts 11044 1726853245.21750: Evaluated conditional (ansible_distribution_major_version != '6'): True 11044 1726853245.21922: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 11044 1726853245.22283: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 11044 1726853245.22327: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 11044 1726853245.22363: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 11044 1726853245.22404: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 11044 1726853245.22489: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 11044 1726853245.22523: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 11044 1726853245.22555: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 11044 1726853245.22588: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 11044 1726853245.22686: variable '__network_is_ostree' from source: set_fact 11044 1726853245.22728: Evaluated conditional (not __network_is_ostree is defined): False 11044 1726853245.22731: when evaluation is False, skipping this task 11044 1726853245.22733: _execute() done 11044 1726853245.22736: dumping result to json 11044 1726853245.22738: done dumping result, returning 11044 1726853245.22741: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree [02083763-bbaf-c5a6-f857-00000000018c] 11044 1726853245.22743: sending task result for task 02083763-bbaf-c5a6-f857-00000000018c skipping: [managed_node1] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 11044 1726853245.22886: no more pending results, returning what we have 11044 1726853245.22889: results queue empty 11044 1726853245.22891: checking for any_errors_fatal 11044 1726853245.22898: done checking for any_errors_fatal 11044 1726853245.22898: checking for max_fail_percentage 11044 1726853245.22900: done checking for max_fail_percentage 11044 1726853245.22901: checking to see if all hosts have failed and the running result is not ok 11044 1726853245.22902: done checking to see if all hosts have failed 11044 1726853245.22903: getting the remaining hosts for this loop 11044 1726853245.22904: done getting the remaining hosts for this loop 11044 1726853245.22908: getting the next task for host managed_node1 11044 1726853245.22917: done getting next task for host managed_node1 11044 
1726853245.22921: ^ task is: TASK: fedora.linux_system_roles.network : Check which services are running 11044 1726853245.22925: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11044 1726853245.22939: getting variables 11044 1726853245.22941: in VariableManager get_vars() 11044 1726853245.22989: Calling all_inventory to load vars for managed_node1 11044 1726853245.22992: Calling groups_inventory to load vars for managed_node1 11044 1726853245.22995: Calling all_plugins_inventory to load vars for managed_node1 11044 1726853245.23006: Calling all_plugins_play to load vars for managed_node1 11044 1726853245.23009: Calling groups_plugins_inventory to load vars for managed_node1 11044 1726853245.23012: Calling groups_plugins_play to load vars for managed_node1 11044 1726853245.23542: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11044 1726853245.23752: done with get_vars() 11044 1726853245.23764: done getting variables 11044 1726853245.23800: done sending task result for task 02083763-bbaf-c5a6-f857-00000000018c 11044 1726853245.23803: WORKER PROCESS EXITING TASK [fedora.linux_system_roles.network : Check which 
services are running] **** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 Friday 20 September 2024 13:27:25 -0400 (0:00:00.034) 0:00:09.614 ****** 11044 1726853245.23882: entering _queue_task() for managed_node1/service_facts 11044 1726853245.23884: Creating lock for service_facts 11044 1726853245.24181: worker is 1 (out of 1 available) 11044 1726853245.24195: exiting _queue_task() for managed_node1/service_facts 11044 1726853245.24207: done queuing things up, now waiting for results queue to drain 11044 1726853245.24208: waiting for pending results... 11044 1726853245.24489: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check which services are running 11044 1726853245.24639: in run() - task 02083763-bbaf-c5a6-f857-00000000018e 11044 1726853245.24666: variable 'ansible_search_path' from source: unknown 11044 1726853245.24677: variable 'ansible_search_path' from source: unknown 11044 1726853245.24721: calling self._execute() 11044 1726853245.24808: variable 'ansible_host' from source: host vars for 'managed_node1' 11044 1726853245.24825: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11044 1726853245.24839: variable 'omit' from source: magic vars 11044 1726853245.25219: variable 'ansible_distribution_major_version' from source: facts 11044 1726853245.25237: Evaluated conditional (ansible_distribution_major_version != '6'): True 11044 1726853245.25253: variable 'omit' from source: magic vars 11044 1726853245.25327: variable 'omit' from source: magic vars 11044 1726853245.25374: variable 'omit' from source: magic vars 11044 1726853245.25419: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11044 1726853245.25463: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11044 1726853245.25493: trying 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11044 1726853245.25516: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11044 1726853245.25534: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11044 1726853245.25584: variable 'inventory_hostname' from source: host vars for 'managed_node1' 11044 1726853245.25588: variable 'ansible_host' from source: host vars for 'managed_node1' 11044 1726853245.25692: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11044 1726853245.25696: Set connection var ansible_timeout to 10 11044 1726853245.25706: Set connection var ansible_shell_executable to /bin/sh 11044 1726853245.25712: Set connection var ansible_shell_type to sh 11044 1726853245.25719: Set connection var ansible_module_compression to ZIP_DEFLATED 11044 1726853245.25727: Set connection var ansible_connection to ssh 11044 1726853245.25734: Set connection var ansible_pipelining to False 11044 1726853245.25761: variable 'ansible_shell_executable' from source: unknown 11044 1726853245.25767: variable 'ansible_connection' from source: unknown 11044 1726853245.25775: variable 'ansible_module_compression' from source: unknown 11044 1726853245.25780: variable 'ansible_shell_type' from source: unknown 11044 1726853245.25785: variable 'ansible_shell_executable' from source: unknown 11044 1726853245.25790: variable 'ansible_host' from source: host vars for 'managed_node1' 11044 1726853245.25800: variable 'ansible_pipelining' from source: unknown 11044 1726853245.25806: variable 'ansible_timeout' from source: unknown 11044 1726853245.25813: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11044 1726853245.25997: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 11044 1726853245.26012: variable 'omit' from source: magic vars 11044 1726853245.26023: starting attempt loop 11044 1726853245.26125: running the handler 11044 1726853245.26128: _low_level_execute_command(): starting 11044 1726853245.26131: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11044 1726853245.26763: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11044 1726853245.26879: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 11044 1726853245.26902: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11044 1726853245.26918: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11044 1726853245.26999: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11044 1726853245.28700: stdout chunk (state=3): >>>/root <<< 11044 1726853245.28873: stderr chunk (state=3): >>>debug2: Received exit status 
from master 0 <<< 11044 1726853245.28878: stdout chunk (state=3): >>><<< 11044 1726853245.28880: stderr chunk (state=3): >>><<< 11044 1726853245.28901: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11044 1726853245.29008: _low_level_execute_command(): starting 11044 1726853245.29013: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853245.2890894-11596-163042982443410 `" && echo ansible-tmp-1726853245.2890894-11596-163042982443410="` echo /root/.ansible/tmp/ansible-tmp-1726853245.2890894-11596-163042982443410 `" ) && sleep 0' 11044 1726853245.29751: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11044 1726853245.29958: stderr chunk (state=3): >>>debug1: Reading 
configuration data /root/.ssh/config
debug1: Reading configuration data /etc/ssh/ssh_config
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf
debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153
debug2: match not found
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config
debug1: configuration requests final Match pass
debug2: resolve_canonicalize: hostname 10.31.45.153 is address
debug1: re-parsing configuration
debug1: Reading configuration data /root/.ssh/config
debug1: Reading configuration data /etc/ssh/ssh_config
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf
debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153
debug2: match found
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config
debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2'
debug2: fd 3 setting O_NONBLOCK <<<
11044 1726853245.30078: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
11044 1726853245.30310: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
11044 1726853245.32192: stdout chunk (state=3): >>>ansible-tmp-1726853245.2890894-11596-163042982443410=/root/.ansible/tmp/ansible-tmp-1726853245.2890894-11596-163042982443410 <<<
11044 1726853245.32377: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
11044 1726853245.32390: stdout chunk (state=3): >>><<<
11044 1726853245.32403: stderr chunk (state=3): >>><<<
11044 1726853245.32525: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853245.2890894-11596-163042982443410=/root/.ansible/tmp/ansible-tmp-1726853245.2890894-11596-163042982443410
, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024
debug1: Reading configuration data /root/.ssh/config
debug1: Reading configuration data /etc/ssh/ssh_config
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf
debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153
debug2: match not found
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config
debug1: configuration requests final Match pass
debug2: resolve_canonicalize: hostname 10.31.45.153 is address
debug1: re-parsing configuration
debug1: Reading configuration data /root/.ssh/config
debug1: Reading configuration data /etc/ssh/ssh_config
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf
debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153
debug2: match found
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config
debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2'
debug2: fd 3 setting O_NONBLOCK
debug2: mux_client_hello_exchange: master version 4
debug1: mux_client_request_session: master session id: 2
debug2: Received exit status from master 0
11044 1726853245.32610: variable 'ansible_module_compression' from source: unknown
11044 1726853245.32613: ANSIBALLZ: Using lock for service_facts
11044 1726853245.32615: ANSIBALLZ: Acquiring lock
11044 1726853245.32978: ANSIBALLZ: Lock acquired: 140360200271584
11044 1726853245.32982: ANSIBALLZ: Creating module
11044 1726853245.46517: ANSIBALLZ: Writing module into payload
11044 1726853245.46611: ANSIBALLZ: Writing module
11044 1726853245.46638: ANSIBALLZ: Renaming module
11044 1726853245.46643: ANSIBALLZ: Done creating module
11044 1726853245.46669: variable 'ansible_facts' from source: unknown
11044 1726853245.46815: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853245.2890894-11596-163042982443410/AnsiballZ_service_facts.py
11044 1726853245.46990: Sending initial data
11044 1726853245.46995: Sent initial data (162 bytes)
11044 1726853245.47586: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024
debug1: Reading configuration data /root/.ssh/config
debug1: Reading configuration data /etc/ssh/ssh_config
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf
debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153
debug2: match not found
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config
debug1: configuration requests final Match pass
debug2: resolve_canonicalize: hostname 10.31.45.153 is address
debug1: re-parsing configuration
debug1: Reading configuration data /root/.ssh/config
debug1: Reading configuration data /etc/ssh/ssh_config
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf
debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153
debug2: match found
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
11044 1726853245.47646: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<<
11044 1726853245.47667: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<<
11044 1726853245.47696: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
11044 1726853245.47785: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
11044 1726853245.49431: stderr chunk (state=3): >>>debug2: Remote version: 3
debug2: Server supports extension "posix-rename@openssh.com" revision 1
debug2: Server supports extension "statvfs@openssh.com" revision 2
debug2: Server supports extension "fstatvfs@openssh.com" revision 2
debug2: Server supports extension "hardlink@openssh.com" revision 1
debug2: Server supports extension "fsync@openssh.com" revision 1
debug2: Server supports extension "lsetstat@openssh.com" revision 1
debug2: Server supports extension "limits@openssh.com" revision 1
debug2: Server supports extension "expand-path@openssh.com" revision 1
debug2: Server supports extension "copy-data" revision 1
debug2: Unrecognised server extension "home-directory"
debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<<
11044 1726853245.49472: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<<
11044 1726853245.49519: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853245.2890894-11596-163042982443410/AnsiballZ_service_facts.py" <<<
11044 1726853245.49561: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-1104467doc9gy/tmph54obo1y /root/.ansible/tmp/ansible-tmp-1726853245.2890894-11596-163042982443410/AnsiballZ_service_facts.py <<<
11044 1726853245.49565: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory
debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-1104467doc9gy/tmph54obo1y" to remote "/root/.ansible/tmp/ansible-tmp-1726853245.2890894-11596-163042982443410/AnsiballZ_service_facts.py"
debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853245.2890894-11596-163042982443410/AnsiballZ_service_facts.py" <<<
11044 1726853245.50739: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
11044 1726853245.50743: stdout chunk (state=3): >>><<<
11044 1726853245.50746: stderr chunk (state=3): >>><<<
11044 1726853245.50748: done transferring module to remote
11044 1726853245.50750: _low_level_execute_command(): starting
11044 1726853245.50753: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853245.2890894-11596-163042982443410/ /root/.ansible/tmp/ansible-tmp-1726853245.2890894-11596-163042982443410/AnsiballZ_service_facts.py && sleep 0'
11044 1726853245.51339: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<<
11044 1726853245.51353: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<<
11044 1726853245.51370: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<<
11044 1726853245.51393: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<<
11044 1726853245.51511: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 <<<
11044 1726853245.51515: stderr chunk (state=3): >>>debug2: match not found
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config
debug1: configuration requests final Match pass
debug2: resolve_canonicalize: hostname 10.31.45.153 is address
debug1: re-parsing configuration
debug1: Reading configuration data /root/.ssh/config
debug1: Reading configuration data /etc/ssh/ssh_config
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf
debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153
debug2: match found
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config
debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2'
debug2: fd 3 setting O_NONBLOCK <<<
11044 1726853245.51565: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
11044 1726853245.51603: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
11044 1726853245.53428: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
11044 1726853245.53439: stdout chunk (state=3): >>><<<
11044 1726853245.53451: stderr chunk (state=3): >>><<<
11044 1726853245.53476: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024
debug1: Reading configuration data /root/.ssh/config
debug1: Reading configuration data /etc/ssh/ssh_config
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf
debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153
debug2: match not found
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config
debug1: configuration requests final Match pass
debug2: resolve_canonicalize: hostname 10.31.45.153 is address
debug1: re-parsing configuration
debug1: Reading configuration data /root/.ssh/config
debug1: Reading configuration data /etc/ssh/ssh_config
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf
debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153
debug2: match found
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config
debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2'
debug2: fd 3 setting O_NONBLOCK
debug2: mux_client_hello_exchange: master version 4
debug1: mux_client_request_session: master session id: 2
debug2: Received exit status from master 0
11044 1726853245.53558: _low_level_execute_command(): starting
11044 1726853245.53561: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853245.2890894-11596-163042982443410/AnsiballZ_service_facts.py && sleep 0'
11044 1726853245.54098: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<<
11044 1726853245.54114: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<<
11044 1726853245.54188: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf
debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153
debug2: match not found
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config
debug1: configuration requests final Match pass
debug2: resolve_canonicalize: hostname 10.31.45.153 is address
debug1: re-parsing configuration
debug1: Reading configuration data /root/.ssh/config
debug1: Reading configuration data /etc/ssh/ssh_config
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf
debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153
debug2: match found
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
11044
1726853245.54232: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 11044 1726853245.54247: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11044 1726853245.54276: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11044 1726853245.54403: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11044 1726853247.07503: stdout chunk (state=3): >>> {"ansible_facts": {"services": {"audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", 
"status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": 
"systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "st<<< 11044 1726853247.07536: stdout chunk (state=3): >>>opped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": 
"stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": 
"enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-ma<<< 11044 1726853247.07596: stdout chunk (state=3): >>>rk.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": 
"indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-clear.service": {"name": "systemd-hibernate-clear.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": 
"systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-quotacheck-root.service": {"name": 
"systemd-quotacheck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-load-credentials.service": {"name": "systemd-udev-load-credentials.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": 
"blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "capsule@.service": {"name": "capsule@.service", "state": "unknown", "status": "static", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", 
"state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fips-crypto-policy-overlay.service": {"name": "fips-crypto-policy-overlay.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "lvm-devices-import.service": {"name": "lvm-devices-import.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "inactive", "status": "disabled", "source": "systemd"}, 
"modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon-root.service": {"name": "quotaon-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon@.service": {"name": "quotaon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, 
"sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-bootctl@.service": {"name": "systemd-bootctl@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-creds@.service": {"name": "systemd-creds@.service", "state": "unknown", "status": "static", "source": "systemd"}, 
"systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald-sync@.service": {"name": "systemd-journald-sync@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": 
"systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock@.service": {"name": "systemd-pcrlock@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck@.service": {"name": "systemd-quotacheck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": 
{"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} <<< 11044 1726853247.09284: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.45.153 closed. <<< 11044 1726853247.09287: stdout chunk (state=3): >>><<< 11044 1726853247.09290: stderr chunk (state=3): >>><<< 11044 1726853247.09293: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"services": {"audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", 
"state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", 
"source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": 
{"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", 
"source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", 
"state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-clear.service": {"name": "systemd-hibernate-clear.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": 
"stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-quotacheck-root.service": {"name": "systemd-quotacheck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": 
"stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-load-credentials.service": {"name": "systemd-udev-load-credentials.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": 
"wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "capsule@.service": {"name": "capsule@.service", "state": "unknown", "status": "static", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", 
"state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fips-crypto-policy-overlay.service": {"name": "fips-crypto-policy-overlay.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "lvm-devices-import.service": {"name": "lvm-devices-import.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", 
"state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon-root.service": {"name": "quotaon-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon@.service": {"name": "quotaon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": 
"inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-bootctl@.service": {"name": 
"systemd-bootctl@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-creds@.service": {"name": "systemd-creds@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald-sync@.service": {"name": "systemd-journald-sync@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": 
"systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock@.service": {"name": "systemd-pcrlock@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck@.service": {"name": "systemd-quotacheck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", 
"status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 
setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.45.153 closed. 11044 1726853247.11143: done with _execute_module (service_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'service_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853245.2890894-11596-163042982443410/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11044 1726853247.11175: _low_level_execute_command(): starting 11044 1726853247.11185: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853245.2890894-11596-163042982443410/ > /dev/null 2>&1 && sleep 0' 11044 1726853247.12013: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11044 1726853247.12031: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11044 1726853247.12290: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK <<< 11044 1726853247.12306: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11044 1726853247.12440: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11044 1726853247.14361: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11044 1726853247.14365: stdout chunk (state=3): >>><<< 11044 1726853247.14369: stderr chunk (state=3): >>><<< 11044 1726853247.14783: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: 
mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11044 1726853247.14787: handler run complete 11044 1726853247.15127: variable 'ansible_facts' from source: unknown 11044 1726853247.15739: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11044 1726853247.16873: variable 'ansible_facts' from source: unknown 11044 1726853247.17126: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11044 1726853247.17660: attempt loop complete, returning result 11044 1726853247.17734: _execute() done 11044 1726853247.17742: dumping result to json 11044 1726853247.17913: done dumping result, returning 11044 1726853247.18163: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check which services are running [02083763-bbaf-c5a6-f857-00000000018e] 11044 1726853247.18166: sending task result for task 02083763-bbaf-c5a6-f857-00000000018e 11044 1726853247.20033: done sending task result for task 02083763-bbaf-c5a6-f857-00000000018e 11044 1726853247.20037: WORKER PROCESS EXITING ok: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 11044 1726853247.20141: no more pending results, returning what we have 11044 1726853247.20147: results queue empty 11044 1726853247.20148: checking for any_errors_fatal 11044 1726853247.20151: done checking for any_errors_fatal 11044 1726853247.20152: checking for max_fail_percentage 11044 1726853247.20154: done checking for max_fail_percentage 11044 1726853247.20155: checking to see if all hosts have failed and the running result is not ok 11044 1726853247.20155: done checking to see if all hosts have failed 11044 1726853247.20156: getting the remaining hosts for this loop 11044 1726853247.20157: done getting the remaining hosts for this loop 11044 1726853247.20160: getting 
the next task for host managed_node1 11044 1726853247.20166: done getting next task for host managed_node1 11044 1726853247.20169: ^ task is: TASK: fedora.linux_system_roles.network : Check which packages are installed 11044 1726853247.20174: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11044 1726853247.20184: getting variables 11044 1726853247.20186: in VariableManager get_vars() 11044 1726853247.20228: Calling all_inventory to load vars for managed_node1 11044 1726853247.20231: Calling groups_inventory to load vars for managed_node1 11044 1726853247.20233: Calling all_plugins_inventory to load vars for managed_node1 11044 1726853247.20243: Calling all_plugins_play to load vars for managed_node1 11044 1726853247.20248: Calling groups_plugins_inventory to load vars for managed_node1 11044 1726853247.20251: Calling groups_plugins_play to load vars for managed_node1 11044 1726853247.21797: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11044 1726853247.22314: done with get_vars() 11044 1726853247.22327: done getting variables TASK [fedora.linux_system_roles.network : Check which packages are installed] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26 Friday 20 September 2024 13:27:27 -0400 (0:00:01.985) 0:00:11.600 ****** 11044 1726853247.22430: entering _queue_task() for managed_node1/package_facts 11044 1726853247.22431: Creating lock for package_facts 11044 1726853247.22691: worker is 1 (out of 1 available) 11044 1726853247.22705: exiting _queue_task() for managed_node1/package_facts 11044 1726853247.22718: done queuing things up, now waiting for results queue to drain 11044 1726853247.22720: waiting for pending results... 
11044 1726853247.22883: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check which packages are installed 11044 1726853247.22970: in run() - task 02083763-bbaf-c5a6-f857-00000000018f 11044 1726853247.22984: variable 'ansible_search_path' from source: unknown 11044 1726853247.22987: variable 'ansible_search_path' from source: unknown 11044 1726853247.23015: calling self._execute() 11044 1726853247.23075: variable 'ansible_host' from source: host vars for 'managed_node1' 11044 1726853247.23082: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11044 1726853247.23089: variable 'omit' from source: magic vars 11044 1726853247.23369: variable 'ansible_distribution_major_version' from source: facts 11044 1726853247.23381: Evaluated conditional (ansible_distribution_major_version != '6'): True 11044 1726853247.23390: variable 'omit' from source: magic vars 11044 1726853247.23452: variable 'omit' from source: magic vars 11044 1726853247.23484: variable 'omit' from source: magic vars 11044 1726853247.23514: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11044 1726853247.23576: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11044 1726853247.23579: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11044 1726853247.23584: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11044 1726853247.23604: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11044 1726853247.23629: variable 'inventory_hostname' from source: host vars for 'managed_node1' 11044 1726853247.23632: variable 'ansible_host' from source: host vars for 'managed_node1' 11044 1726853247.23635: variable 'ansible_ssh_extra_args' from source: host 
vars for 'managed_node1' 11044 1726853247.23726: Set connection var ansible_timeout to 10 11044 1726853247.23738: Set connection var ansible_shell_executable to /bin/sh 11044 1726853247.23741: Set connection var ansible_shell_type to sh 11044 1726853247.23746: Set connection var ansible_module_compression to ZIP_DEFLATED 11044 1726853247.23753: Set connection var ansible_connection to ssh 11044 1726853247.23757: Set connection var ansible_pipelining to False 11044 1726853247.23777: variable 'ansible_shell_executable' from source: unknown 11044 1726853247.23779: variable 'ansible_connection' from source: unknown 11044 1726853247.23782: variable 'ansible_module_compression' from source: unknown 11044 1726853247.23785: variable 'ansible_shell_type' from source: unknown 11044 1726853247.23787: variable 'ansible_shell_executable' from source: unknown 11044 1726853247.23789: variable 'ansible_host' from source: host vars for 'managed_node1' 11044 1726853247.23792: variable 'ansible_pipelining' from source: unknown 11044 1726853247.23804: variable 'ansible_timeout' from source: unknown 11044 1726853247.23807: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11044 1726853247.24132: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 11044 1726853247.24138: variable 'omit' from source: magic vars 11044 1726853247.24140: starting attempt loop 11044 1726853247.24142: running the handler 11044 1726853247.24144: _low_level_execute_command(): starting 11044 1726853247.24146: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11044 1726853247.25045: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11044 1726853247.25154: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 11044 1726853247.25176: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11044 1726853247.25197: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11044 1726853247.26889: stdout chunk (state=3): >>>/root <<< 11044 1726853247.27012: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11044 1726853247.27061: stderr chunk (state=3): >>><<< 11044 1726853247.27065: stdout chunk (state=3): >>><<< 11044 1726853247.27165: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 
10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11044 1726853247.27169: _low_level_execute_command(): starting 11044 1726853247.27173: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853247.270841-11696-229485926353825 `" && echo ansible-tmp-1726853247.270841-11696-229485926353825="` echo /root/.ansible/tmp/ansible-tmp-1726853247.270841-11696-229485926353825 `" ) && sleep 0' 11044 1726853247.27702: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11044 1726853247.27717: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11044 1726853247.27730: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11044 1726853247.27750: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11044 1726853247.27768: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 <<< 11044 1726853247.27783: stderr chunk (state=3): >>>debug2: match not found <<< 11044 1726853247.27798: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11044 1726853247.27826: stderr chunk (state=3): >>>debug1: configuration requests final 
Match pass <<< 11044 1726853247.27893: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11044 1726853247.27927: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 11044 1726853247.27945: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11044 1726853247.28082: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11044 1726853247.28120: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11044 1726853247.29998: stdout chunk (state=3): >>>ansible-tmp-1726853247.270841-11696-229485926353825=/root/.ansible/tmp/ansible-tmp-1726853247.270841-11696-229485926353825 <<< 11044 1726853247.30180: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11044 1726853247.30184: stdout chunk (state=3): >>><<< 11044 1726853247.30186: stderr chunk (state=3): >>><<< 11044 1726853247.30189: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853247.270841-11696-229485926353825=/root/.ansible/tmp/ansible-tmp-1726853247.270841-11696-229485926353825 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final 
Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11044 1726853247.30218: variable 'ansible_module_compression' from source: unknown 11044 1726853247.30269: ANSIBALLZ: Using lock for package_facts 11044 1726853247.30274: ANSIBALLZ: Acquiring lock 11044 1726853247.30276: ANSIBALLZ: Lock acquired: 140360200381472 11044 1726853247.30378: ANSIBALLZ: Creating module 11044 1726853247.63880: ANSIBALLZ: Writing module into payload 11044 1726853247.64077: ANSIBALLZ: Writing module 11044 1726853247.64081: ANSIBALLZ: Renaming module 11044 1726853247.64097: ANSIBALLZ: Done creating module 11044 1726853247.64149: variable 'ansible_facts' from source: unknown 11044 1726853247.64365: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853247.270841-11696-229485926353825/AnsiballZ_package_facts.py 11044 1726853247.64556: Sending initial data 11044 1726853247.64559: Sent initial data (161 bytes) 11044 1726853247.65276: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11044 1726853247.65297: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: 
match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11044 1726853247.65338: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK <<< 11044 1726853247.65368: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11044 1726853247.65483: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11044 1726853247.67143: stderr chunk (state=3): >>>debug2: Remote version: 3 <<< 11044 1726853247.67173: stderr chunk (state=3): >>>debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 11044 1726853247.67202: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 11044 1726853247.67301: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-1104467doc9gy/tmpih871mss /root/.ansible/tmp/ansible-tmp-1726853247.270841-11696-229485926353825/AnsiballZ_package_facts.py <<< 11044 1726853247.67308: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853247.270841-11696-229485926353825/AnsiballZ_package_facts.py" <<< 11044 1726853247.67576: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-1104467doc9gy/tmpih871mss" to remote "/root/.ansible/tmp/ansible-tmp-1726853247.270841-11696-229485926353825/AnsiballZ_package_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853247.270841-11696-229485926353825/AnsiballZ_package_facts.py" <<< 11044 1726853247.69655: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11044 1726853247.69658: stdout chunk (state=3): >>><<< 11044 1726853247.69668: stderr chunk (state=3): >>><<< 11044 1726853247.69697: done transferring module to remote 11044 1726853247.69708: _low_level_execute_command(): starting 11044 1726853247.69713: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853247.270841-11696-229485926353825/ /root/.ansible/tmp/ansible-tmp-1726853247.270841-11696-229485926353825/AnsiballZ_package_facts.py && sleep 0' 11044 1726853247.70592: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11044 1726853247.70684: stderr chunk (state=3): >>>debug1: configuration requests final 
Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11044 1726853247.70709: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 11044 1726853247.70727: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11044 1726853247.70843: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11044 1726853247.70977: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11044 1726853247.72789: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11044 1726853247.72862: stderr chunk (state=3): >>><<< 11044 1726853247.72865: stdout chunk (state=3): >>><<< 11044 1726853247.72881: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 
10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11044 1726853247.72884: _low_level_execute_command(): starting 11044 1726853247.72889: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853247.270841-11696-229485926353825/AnsiballZ_package_facts.py && sleep 0' 11044 1726853247.74257: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11044 1726853247.74426: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 11044 1726853247.74480: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11044 1726853247.74612: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11044 1726853247.74659: stderr chunk (state=3): >>>debug1: 
mux_client_request_session: master session id: 2 <<< 11044 1726853248.18667: stdout chunk (state=3): >>> {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-mono-vf-fonts": [{"name": "redhat-mono-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-text-vf-fonts": [{"name": "redhat-text-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": 
"langpacks-fonts-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240910", "release": "6.el10", 
"epoch": null, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks"<<< 11044 1726853248.18693: stdout chunk (state=3): >>>: [{"name": "gawk-all-langpacks", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.379", "release": "10.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "10.0", "release": "0.19.el10", 
"epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "21.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.6", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.40.2", 
"release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.5", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.30", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.50", "release": "1.el10", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.46.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "51.20230828cvs.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.2", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "9.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": "3.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.10.0", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.5.0", "release": "1.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.0", "release": "3.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.4", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": 
"coreutils", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "101.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "8.el10", "epoch": null,<<< 11044 1726853248.18886: stdout chunk (state=3): >>> "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.11.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.7", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": 
"libcollection", "version": "0.7.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.3", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.6.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", 
"release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "42.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.6.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.29", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.79.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsecret": [{"name": "libsecret", "version": "0.21.2", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pinentry": [{"name": "pinentry", "version": "1.3.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.7.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "9.el10", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.7", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.10", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.8", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "35", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "661", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "memstrack": [{"name": "memstrack", "version": "0.2.5", 
"release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.25.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "22.el10.gating_test1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse3-libs": [{"name": "fuse3-libs", "version": "3.16.2", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "28.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "6.el10", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.62.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-gssapi": [{"name": "cyrus-sasl-gssapi", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.9.1", "release": "4.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "crontabs": [{"name": "crontabs", "version": "1.11^20190603git9e74f2d", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "29.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240905", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.22.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "makedumpfile": [{"name": "makedumpfile", "version": "1.7.5", "release": "12.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "76.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": 
"rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.32", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-krb5-common": [{"name": "sssd-krb5-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", 
"version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "15.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", 
"release": "14.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.29", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kdump-utils": [{"name": "kdump-utils", "version": "1.0.43", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.13.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.6.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": 
"4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.3", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.24", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "1.2304.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2408.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", 
"version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el10", "epoch": 4, "arch": "noarch", 
"source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "8.el10", "epoch": 18, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.2.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.20", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "12.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "iwlwifi-dvm-firmware": [{"name": "iwlwifi-dvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwlwifi-mvm-firmware": [{"name": "iwlwifi-mvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "37.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.89", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.189", "release": "511.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.27", "release": "2.el10", "epoch": 
null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.12", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "510.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.20", "release": "510.el10", "epoch": 0, "arch": "x86_64", 
"source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.18", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": 
"1.29", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.55", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "510.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.38", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "510.el10", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "510.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.58", "release": "2.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], 
"perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.91", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.56", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.17", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.44", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.45.2", "release": "3.el10<<< 11044 1726853248.18951: stdout chunk (state=3): >>>", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.083", "release": "2.el10", "epoch": 
2, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.39", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.2.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "24.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "4.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.10", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.24.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "4.el10", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.19", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": 
"python3-jsonschema-specifications", "version": "2023.11.2", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "18.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.32.3", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.6", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "17.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "20.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.11", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], 
"cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "11.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} <<< 11044 1726853248.20976: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.45.153 closed. 
<<< 11044 1726853248.20980: stdout chunk (state=3): >>><<< 11044 1726853248.20985: stderr chunk (state=3): >>><<< 11044 1726853248.21184: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-mono-vf-fonts": [{"name": "redhat-mono-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-text-vf-fonts": [{"name": "redhat-text-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.1", "release": "2.el10", "epoch": null, 
"arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": 
[{"name": "linux-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.379", "release": "10.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "10.0", "release": 
"0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "21.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.6", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": 
"2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.5", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.30", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.50", "release": "1.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.46.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "51.20230828cvs.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.2", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "9.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": 
"3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.10.0", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.5.0", "release": "1.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.0", "release": "3.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.4", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": 
[{"name": "coreutils", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "101.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.11.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.7", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", 
"release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.3", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.6.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "11.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "42.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.6.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.29", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.79.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsecret": [{"name": "libsecret", "version": "0.21.2", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pinentry": [{"name": "pinentry", "version": "1.3.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", 
"release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.7.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": 
"ethtool", "version": "6.7", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.10", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.8", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "35", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "661", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "memstrack": [{"name": "memstrack", "version": "0.2.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "c-ares": [{"name": "c-ares", "version": "1.25.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "22.el10.gating_test1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse3-libs": [{"name": "fuse3-libs", "version": "3.16.2", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "28.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": 
[{"name": "iptables-libs", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.62.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-gssapi": [{"name": "cyrus-sasl-gssapi", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", 
"version": "1.11^20190603git9e74f2d", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "29.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240905", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.22.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "makedumpfile": [{"name": "makedumpfile", "version": "1.7.5", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": 
[{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "76.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": 
"kernel-modules-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.32", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-krb5-common": [{"name": "sssd-krb5-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.5", "release": "2.el10", 
"epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "15.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "14.el10", "epoch": 1, "arch": 
"noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.29", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kdump-utils": [{"name": "kdump-utils", "version": "1.0.43", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.13.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.6.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.3", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.24", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "1.2304.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2408.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.48", "release": "5.el10", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": 
"qemu-guest-agent", "version": "9.0.0", "release": "8.el10", "epoch": 18, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.2.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.20", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwlwifi-dvm-firmware": [{"name": 
"iwlwifi-dvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwlwifi-mvm-firmware": [{"name": "iwlwifi-mvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "37.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.89", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.189", "release": "511.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.27", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.12", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "510.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.20", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IPC-Open3": [{"name": 
"perl-IPC-Open3", "version": "1.22", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.18", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.29", "release": "510.el10", "epoch": 0, "arch": 
"x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.55", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "510.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.38", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "510.el10", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "510.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.58", "release": "2.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": 
"1.54", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.91", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.56", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.17", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.44", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.39", 
"release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.2.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "24.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "4.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.10", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.24.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": 
"2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.19", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "5.el10", "epoch": null, "arch": "noarch", 
"source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "3.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "18.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.32.3", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.6", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "17.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "20.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.11", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "10.el10", "epoch": null, "arch": "noarch", "source": 
"rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "11.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.45.153 closed. 
11044 1726853248.25479: done with _execute_module (package_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'package_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853247.270841-11696-229485926353825/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11044 1726853248.25484: _low_level_execute_command(): starting 11044 1726853248.25486: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853247.270841-11696-229485926353825/ > /dev/null 2>&1 && sleep 0' 11044 1726853248.26290: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11044 1726853248.26297: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11044 1726853248.26313: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11044 1726853248.26586: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 
originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK <<< 11044 1726853248.26621: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11044 1726853248.26748: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11044 1726853248.28562: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11044 1726853248.28618: stderr chunk (state=3): >>><<< 11044 1726853248.28623: stdout chunk (state=3): >>><<< 11044 1726853248.28655: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11044 1726853248.28661: handler run complete 11044 1726853248.29944: variable 
'ansible_facts' from source: unknown 11044 1726853248.30768: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11044 1726853248.33046: variable 'ansible_facts' from source: unknown 11044 1726853248.33607: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11044 1726853248.34279: attempt loop complete, returning result 11044 1726853248.34290: _execute() done 11044 1726853248.34293: dumping result to json 11044 1726853248.34549: done dumping result, returning 11044 1726853248.34552: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check which packages are installed [02083763-bbaf-c5a6-f857-00000000018f] 11044 1726853248.34560: sending task result for task 02083763-bbaf-c5a6-f857-00000000018f 11044 1726853248.37508: done sending task result for task 02083763-bbaf-c5a6-f857-00000000018f 11044 1726853248.37512: WORKER PROCESS EXITING ok: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 11044 1726853248.37607: no more pending results, returning what we have 11044 1726853248.37613: results queue empty 11044 1726853248.37614: checking for any_errors_fatal 11044 1726853248.37618: done checking for any_errors_fatal 11044 1726853248.37619: checking for max_fail_percentage 11044 1726853248.37620: done checking for max_fail_percentage 11044 1726853248.37621: checking to see if all hosts have failed and the running result is not ok 11044 1726853248.37622: done checking to see if all hosts have failed 11044 1726853248.37622: getting the remaining hosts for this loop 11044 1726853248.37623: done getting the remaining hosts for this loop 11044 1726853248.37627: getting the next task for host managed_node1 11044 1726853248.37633: done getting next task for host managed_node1 11044 1726853248.37636: ^ task is: TASK: 
fedora.linux_system_roles.network : Print network provider 11044 1726853248.37641: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11044 1726853248.37654: getting variables 11044 1726853248.37655: in VariableManager get_vars() 11044 1726853248.37688: Calling all_inventory to load vars for managed_node1 11044 1726853248.37691: Calling groups_inventory to load vars for managed_node1 11044 1726853248.37693: Calling all_plugins_inventory to load vars for managed_node1 11044 1726853248.37702: Calling all_plugins_play to load vars for managed_node1 11044 1726853248.37704: Calling groups_plugins_inventory to load vars for managed_node1 11044 1726853248.37707: Calling groups_plugins_play to load vars for managed_node1 11044 1726853248.39002: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11044 1726853248.41366: done with get_vars() 11044 1726853248.41396: done getting variables 11044 1726853248.41583: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Print network provider] ************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7 Friday 20 
September 2024 13:27:28 -0400 (0:00:01.191) 0:00:12.791 ****** 11044 1726853248.41653: entering _queue_task() for managed_node1/debug 11044 1726853248.42053: worker is 1 (out of 1 available) 11044 1726853248.42181: exiting _queue_task() for managed_node1/debug 11044 1726853248.42193: done queuing things up, now waiting for results queue to drain 11044 1726853248.42195: waiting for pending results... 11044 1726853248.42402: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Print network provider 11044 1726853248.42608: in run() - task 02083763-bbaf-c5a6-f857-000000000027 11044 1726853248.42613: variable 'ansible_search_path' from source: unknown 11044 1726853248.42615: variable 'ansible_search_path' from source: unknown 11044 1726853248.42619: calling self._execute() 11044 1726853248.42722: variable 'ansible_host' from source: host vars for 'managed_node1' 11044 1726853248.42737: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11044 1726853248.42760: variable 'omit' from source: magic vars 11044 1726853248.43167: variable 'ansible_distribution_major_version' from source: facts 11044 1726853248.43190: Evaluated conditional (ansible_distribution_major_version != '6'): True 11044 1726853248.43202: variable 'omit' from source: magic vars 11044 1726853248.43374: variable 'omit' from source: magic vars 11044 1726853248.43412: variable 'network_provider' from source: set_fact 11044 1726853248.43437: variable 'omit' from source: magic vars 11044 1726853248.43546: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11044 1726853248.43616: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11044 1726853248.43651: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11044 1726853248.43702: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py 
(found_in_cache=True, class_only=False) 11044 1726853248.43765: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11044 1726853248.43859: variable 'inventory_hostname' from source: host vars for 'managed_node1' 11044 1726853248.43968: variable 'ansible_host' from source: host vars for 'managed_node1' 11044 1726853248.43974: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11044 1726853248.44107: Set connection var ansible_timeout to 10 11044 1726853248.44149: Set connection var ansible_shell_executable to /bin/sh 11044 1726853248.44192: Set connection var ansible_shell_type to sh 11044 1726853248.44202: Set connection var ansible_module_compression to ZIP_DEFLATED 11044 1726853248.44211: Set connection var ansible_connection to ssh 11044 1726853248.44254: Set connection var ansible_pipelining to False 11044 1726853248.44469: variable 'ansible_shell_executable' from source: unknown 11044 1726853248.44475: variable 'ansible_connection' from source: unknown 11044 1726853248.44477: variable 'ansible_module_compression' from source: unknown 11044 1726853248.44479: variable 'ansible_shell_type' from source: unknown 11044 1726853248.44481: variable 'ansible_shell_executable' from source: unknown 11044 1726853248.44483: variable 'ansible_host' from source: host vars for 'managed_node1' 11044 1726853248.44485: variable 'ansible_pipelining' from source: unknown 11044 1726853248.44487: variable 'ansible_timeout' from source: unknown 11044 1726853248.44489: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11044 1726853248.44718: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11044 
1726853248.44741: variable 'omit' from source: magic vars 11044 1726853248.44755: starting attempt loop 11044 1726853248.44794: running the handler 11044 1726853248.44978: handler run complete 11044 1726853248.44982: attempt loop complete, returning result 11044 1726853248.44985: _execute() done 11044 1726853248.44987: dumping result to json 11044 1726853248.44990: done dumping result, returning 11044 1726853248.44992: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Print network provider [02083763-bbaf-c5a6-f857-000000000027] 11044 1726853248.44994: sending task result for task 02083763-bbaf-c5a6-f857-000000000027 ok: [managed_node1] => {} MSG: Using network provider: nm 11044 1726853248.45233: no more pending results, returning what we have 11044 1726853248.45237: results queue empty 11044 1726853248.45238: checking for any_errors_fatal 11044 1726853248.45251: done checking for any_errors_fatal 11044 1726853248.45252: checking for max_fail_percentage 11044 1726853248.45255: done checking for max_fail_percentage 11044 1726853248.45256: checking to see if all hosts have failed and the running result is not ok 11044 1726853248.45257: done checking to see if all hosts have failed 11044 1726853248.45257: getting the remaining hosts for this loop 11044 1726853248.45259: done getting the remaining hosts for this loop 11044 1726853248.45262: getting the next task for host managed_node1 11044 1726853248.45270: done getting next task for host managed_node1 11044 1726853248.45277: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 11044 1726853248.45283: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11044 1726853248.45294: getting variables 11044 1726853248.45295: in VariableManager get_vars() 11044 1726853248.45341: Calling all_inventory to load vars for managed_node1 11044 1726853248.45347: Calling groups_inventory to load vars for managed_node1 11044 1726853248.45350: Calling all_plugins_inventory to load vars for managed_node1 11044 1726853248.45361: Calling all_plugins_play to load vars for managed_node1 11044 1726853248.45364: Calling groups_plugins_inventory to load vars for managed_node1 11044 1726853248.45367: Calling groups_plugins_play to load vars for managed_node1 11044 1726853248.46528: done sending task result for task 02083763-bbaf-c5a6-f857-000000000027 11044 1726853248.46533: WORKER PROCESS EXITING 11044 1726853248.48569: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11044 1726853248.51161: done with get_vars() 11044 1726853248.51193: done getting variables 11044 1726853248.51418: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11 Friday 20 September 2024 13:27:28 -0400 (0:00:00.098) 
0:00:12.890 ****** 11044 1726853248.51456: entering _queue_task() for managed_node1/fail 11044 1726853248.51458: Creating lock for fail 11044 1726853248.52167: worker is 1 (out of 1 available) 11044 1726853248.52296: exiting _queue_task() for managed_node1/fail 11044 1726853248.52309: done queuing things up, now waiting for results queue to drain 11044 1726853248.52310: waiting for pending results... 11044 1726853248.53038: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 11044 1726853248.53315: in run() - task 02083763-bbaf-c5a6-f857-000000000028 11044 1726853248.53352: variable 'ansible_search_path' from source: unknown 11044 1726853248.53356: variable 'ansible_search_path' from source: unknown 11044 1726853248.53393: calling self._execute() 11044 1726853248.53590: variable 'ansible_host' from source: host vars for 'managed_node1' 11044 1726853248.53595: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11044 1726853248.53607: variable 'omit' from source: magic vars 11044 1726853248.53982: variable 'ansible_distribution_major_version' from source: facts 11044 1726853248.53994: Evaluated conditional (ansible_distribution_major_version != '6'): True 11044 1726853248.54119: variable 'network_state' from source: role '' defaults 11044 1726853248.54128: Evaluated conditional (network_state != {}): False 11044 1726853248.54132: when evaluation is False, skipping this task 11044 1726853248.54134: _execute() done 11044 1726853248.54137: dumping result to json 11044 1726853248.54139: done dumping result, returning 11044 1726853248.54153: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider [02083763-bbaf-c5a6-f857-000000000028] 11044 1726853248.54156: 
sending task result for task 02083763-bbaf-c5a6-f857-000000000028 11044 1726853248.54242: done sending task result for task 02083763-bbaf-c5a6-f857-000000000028 11044 1726853248.54247: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 11044 1726853248.54299: no more pending results, returning what we have 11044 1726853248.54303: results queue empty 11044 1726853248.54304: checking for any_errors_fatal 11044 1726853248.54310: done checking for any_errors_fatal 11044 1726853248.54311: checking for max_fail_percentage 11044 1726853248.54312: done checking for max_fail_percentage 11044 1726853248.54313: checking to see if all hosts have failed and the running result is not ok 11044 1726853248.54314: done checking to see if all hosts have failed 11044 1726853248.54315: getting the remaining hosts for this loop 11044 1726853248.54316: done getting the remaining hosts for this loop 11044 1726853248.54319: getting the next task for host managed_node1 11044 1726853248.54324: done getting next task for host managed_node1 11044 1726853248.54327: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 11044 1726853248.54331: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11044 1726853248.54349: getting variables 11044 1726853248.54350: in VariableManager get_vars() 11044 1726853248.54391: Calling all_inventory to load vars for managed_node1 11044 1726853248.54394: Calling groups_inventory to load vars for managed_node1 11044 1726853248.54396: Calling all_plugins_inventory to load vars for managed_node1 11044 1726853248.54404: Calling all_plugins_play to load vars for managed_node1 11044 1726853248.54406: Calling groups_plugins_inventory to load vars for managed_node1 11044 1726853248.54409: Calling groups_plugins_play to load vars for managed_node1 11044 1726853248.55718: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11044 1726853248.57863: done with get_vars() 11044 1726853248.57929: done getting variables 11044 1726853248.58056: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18 Friday 20 September 2024 13:27:28 -0400 (0:00:00.066) 0:00:12.957 ****** 11044 1726853248.58151: entering _queue_task() for managed_node1/fail 11044 1726853248.58801: worker is 1 (out of 1 available) 11044 1726853248.58814: exiting _queue_task() for managed_node1/fail 11044 1726853248.58826: done queuing things up, now waiting for results queue to drain 11044 1726853248.58827: waiting for pending results... 
11044 1726853248.59212: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 11044 1726853248.59297: in run() - task 02083763-bbaf-c5a6-f857-000000000029 11044 1726853248.59311: variable 'ansible_search_path' from source: unknown 11044 1726853248.59314: variable 'ansible_search_path' from source: unknown 11044 1726853248.59450: calling self._execute() 11044 1726853248.59478: variable 'ansible_host' from source: host vars for 'managed_node1' 11044 1726853248.59503: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11044 1726853248.59512: variable 'omit' from source: magic vars 11044 1726853248.59953: variable 'ansible_distribution_major_version' from source: facts 11044 1726853248.59965: Evaluated conditional (ansible_distribution_major_version != '6'): True 11044 1726853248.60150: variable 'network_state' from source: role '' defaults 11044 1726853248.60163: Evaluated conditional (network_state != {}): False 11044 1726853248.60173: when evaluation is False, skipping this task 11044 1726853248.60177: _execute() done 11044 1726853248.60179: dumping result to json 11044 1726853248.60182: done dumping result, returning 11044 1726853248.60231: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 [02083763-bbaf-c5a6-f857-000000000029] 11044 1726853248.60236: sending task result for task 02083763-bbaf-c5a6-f857-000000000029 skipping: [managed_node1] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 11044 1726853248.60443: no more pending results, returning what we have 11044 1726853248.60447: results queue empty 11044 1726853248.60449: checking for any_errors_fatal 11044 1726853248.60456: done checking for any_errors_fatal 
11044 1726853248.60457: checking for max_fail_percentage 11044 1726853248.60459: done checking for max_fail_percentage 11044 1726853248.60460: checking to see if all hosts have failed and the running result is not ok 11044 1726853248.60461: done checking to see if all hosts have failed 11044 1726853248.60462: getting the remaining hosts for this loop 11044 1726853248.60463: done getting the remaining hosts for this loop 11044 1726853248.60467: getting the next task for host managed_node1 11044 1726853248.60475: done getting next task for host managed_node1 11044 1726853248.60480: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 11044 1726853248.60484: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11044 1726853248.60502: done sending task result for task 02083763-bbaf-c5a6-f857-000000000029 11044 1726853248.60505: WORKER PROCESS EXITING 11044 1726853248.60581: getting variables 11044 1726853248.60583: in VariableManager get_vars() 11044 1726853248.60629: Calling all_inventory to load vars for managed_node1 11044 1726853248.60635: Calling groups_inventory to load vars for managed_node1 11044 1726853248.60639: Calling all_plugins_inventory to load vars for managed_node1 11044 1726853248.60654: Calling all_plugins_play to load vars for managed_node1 11044 1726853248.60657: Calling groups_plugins_inventory to load vars for managed_node1 11044 1726853248.60660: Calling groups_plugins_play to load vars for managed_node1 11044 1726853248.62530: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11044 1726853248.64363: done with get_vars() 11044 1726853248.64387: done getting variables 11044 1726853248.64445: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25 Friday 20 September 2024 13:27:28 -0400 (0:00:00.063) 0:00:13.020 ****** 11044 1726853248.64484: entering _queue_task() for managed_node1/fail 11044 1726853248.64884: worker is 1 (out of 1 available) 11044 1726853248.64899: exiting _queue_task() for managed_node1/fail 11044 1726853248.64910: done queuing things up, now waiting for results queue to drain 11044 1726853248.64912: waiting for pending results... 
11044 1726853248.65153: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 11044 1726853248.65234: in run() - task 02083763-bbaf-c5a6-f857-00000000002a 11044 1726853248.65251: variable 'ansible_search_path' from source: unknown 11044 1726853248.65256: variable 'ansible_search_path' from source: unknown 11044 1726853248.65355: calling self._execute() 11044 1726853248.65379: variable 'ansible_host' from source: host vars for 'managed_node1' 11044 1726853248.65385: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11044 1726853248.65395: variable 'omit' from source: magic vars 11044 1726853248.65876: variable 'ansible_distribution_major_version' from source: facts 11044 1726853248.65880: Evaluated conditional (ansible_distribution_major_version != '6'): True 11044 1726853248.65964: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 11044 1726853248.69741: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 11044 1726853248.69818: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 11044 1726853248.69909: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 11044 1726853248.69969: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 11044 1726853248.69996: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 11044 1726853248.70080: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11044 1726853248.70134: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11044 1726853248.70184: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11044 1726853248.70218: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11044 1726853248.70235: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11044 1726853248.70378: variable 'ansible_distribution_major_version' from source: facts 11044 1726853248.70393: Evaluated conditional (ansible_distribution_major_version | int > 9): True 11044 1726853248.70514: variable 'ansible_distribution' from source: facts 11044 1726853248.70518: variable '__network_rh_distros' from source: role '' defaults 11044 1726853248.70527: Evaluated conditional (ansible_distribution in __network_rh_distros): True 11044 1726853248.70946: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11044 1726853248.70949: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11044 1726853248.70952: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11044 
1726853248.70954: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11044 1726853248.70957: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11044 1726853248.70959: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11044 1726853248.70981: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11044 1726853248.71002: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11044 1726853248.71052: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11044 1726853248.71063: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11044 1726853248.71110: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11044 1726853248.71138: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, 
class_only=False) 11044 1726853248.71166: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11044 1726853248.71206: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11044 1726853248.71227: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11044 1726853248.71778: variable 'network_connections' from source: task vars 11044 1726853248.71781: variable 'controller_profile' from source: play vars 11044 1726853248.71784: variable 'controller_profile' from source: play vars 11044 1726853248.71786: variable 'controller_device' from source: play vars 11044 1726853248.71788: variable 'controller_device' from source: play vars 11044 1726853248.71790: variable 'port1_profile' from source: play vars 11044 1726853248.71792: variable 'port1_profile' from source: play vars 11044 1726853248.71800: variable 'dhcp_interface1' from source: play vars 11044 1726853248.71866: variable 'dhcp_interface1' from source: play vars 11044 1726853248.71869: variable 'controller_profile' from source: play vars 11044 1726853248.71933: variable 'controller_profile' from source: play vars 11044 1726853248.71943: variable 'port2_profile' from source: play vars 11044 1726853248.72014: variable 'port2_profile' from source: play vars 11044 1726853248.72025: variable 'dhcp_interface2' from source: play vars 11044 1726853248.72083: variable 'dhcp_interface2' from source: play vars 11044 1726853248.72090: variable 'controller_profile' from source: play vars 11044 1726853248.72223: variable 'controller_profile' from source: play vars 11044 1726853248.72227: 
variable 'network_state' from source: role '' defaults 11044 1726853248.72331: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 11044 1726853248.72519: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 11044 1726853248.72522: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 11044 1726853248.72525: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 11044 1726853248.72565: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 11044 1726853248.72611: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 11044 1726853248.72767: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 11044 1726853248.72772: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 11044 1726853248.72775: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 11044 1726853248.72777: Evaluated conditional (network_connections | selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0 or network_state.get("interfaces", []) | selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0): False 11044 1726853248.72779: when evaluation is False, skipping this task 11044 1726853248.72781: _execute() done 11044 1726853248.72782: dumping result to 
json 11044 1726853248.72784: done dumping result, returning 11044 1726853248.72785: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later [02083763-bbaf-c5a6-f857-00000000002a] 11044 1726853248.72787: sending task result for task 02083763-bbaf-c5a6-f857-00000000002a 11044 1726853248.72850: done sending task result for task 02083763-bbaf-c5a6-f857-00000000002a 11044 1726853248.72854: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "network_connections | selectattr(\"type\", \"defined\") | selectattr(\"type\", \"match\", \"^team$\") | list | length > 0 or network_state.get(\"interfaces\", []) | selectattr(\"type\", \"defined\") | selectattr(\"type\", \"match\", \"^team$\") | list | length > 0", "skip_reason": "Conditional result was False" } 11044 1726853248.72907: no more pending results, returning what we have 11044 1726853248.72911: results queue empty 11044 1726853248.72912: checking for any_errors_fatal 11044 1726853248.72917: done checking for any_errors_fatal 11044 1726853248.72918: checking for max_fail_percentage 11044 1726853248.72919: done checking for max_fail_percentage 11044 1726853248.72920: checking to see if all hosts have failed and the running result is not ok 11044 1726853248.72921: done checking to see if all hosts have failed 11044 1726853248.72922: getting the remaining hosts for this loop 11044 1726853248.72923: done getting the remaining hosts for this loop 11044 1726853248.72927: getting the next task for host managed_node1 11044 1726853248.72934: done getting next task for host managed_node1 11044 1726853248.72942: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 11044 1726853248.72945: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, 
handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11044 1726853248.72960: getting variables 11044 1726853248.72962: in VariableManager get_vars() 11044 1726853248.73006: Calling all_inventory to load vars for managed_node1 11044 1726853248.73010: Calling groups_inventory to load vars for managed_node1 11044 1726853248.73233: Calling all_plugins_inventory to load vars for managed_node1 11044 1726853248.73244: Calling all_plugins_play to load vars for managed_node1 11044 1726853248.73246: Calling groups_plugins_inventory to load vars for managed_node1 11044 1726853248.73249: Calling groups_plugins_play to load vars for managed_node1 11044 1726853248.75158: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11044 1726853248.77061: done with get_vars() 11044 1726853248.77101: done getting variables 11044 1726853248.77224: Loading ActionModule 'dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36 Friday 20 September 2024 13:27:28 -0400 (0:00:00.127) 0:00:13.148 ****** 
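The teaming task above was skipped because its `when` condition evaluated to False. As a side note on what that condition does: it filters `network_connections` (and `network_state.interfaces`) down to entries whose `type` key is defined and matches `^team$`, then checks whether any remain. A minimal plain-Python sketch of that Jinja2 filter chain — this is an illustrative emulation, not the role's actual code:

```python
import re

def has_team_connection(network_connections):
    """Emulate the Jinja2 chain from the log:
    selectattr("type", "defined") | selectattr("type", "match", "^team$")
        | list | length > 0
    """
    # Keep only entries that define a "type" key, then regex-match it
    # against ^team$ (selectattr's "match" test anchors at the start;
    # the explicit ^...$ makes it an exact match).
    matched = [c for c in network_connections
               if "type" in c and re.match("^team$", c["type"])]
    return len(matched) > 0

# Illustrative profiles resembling a bond controller plus an ethernet port
# (hypothetical data, not the play's actual variables):
conns = [
    {"name": "bond0", "type": "bond"},
    {"name": "bond0.0", "type": "ethernet"},
]
print(has_team_connection(conns))  # False -> the task is skipped
```

With no `team`-typed profile present, the condition is False and the abort task is skipped, exactly as the `skip_reason` in the result above reports.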
11044 1726853248.77258: entering _queue_task() for managed_node1/dnf 11044 1726853248.77606: worker is 1 (out of 1 available) 11044 1726853248.77618: exiting _queue_task() for managed_node1/dnf 11044 1726853248.77632: done queuing things up, now waiting for results queue to drain 11044 1726853248.77634: waiting for pending results... 11044 1726853248.77976: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 11044 1726853248.78239: in run() - task 02083763-bbaf-c5a6-f857-00000000002b 11044 1726853248.78250: variable 'ansible_search_path' from source: unknown 11044 1726853248.78258: variable 'ansible_search_path' from source: unknown 11044 1726853248.78264: calling self._execute() 11044 1726853248.78590: variable 'ansible_host' from source: host vars for 'managed_node1' 11044 1726853248.78594: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11044 1726853248.78597: variable 'omit' from source: magic vars 11044 1726853248.79131: variable 'ansible_distribution_major_version' from source: facts 11044 1726853248.79152: Evaluated conditional (ansible_distribution_major_version != '6'): True 11044 1726853248.79752: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 11044 1726853248.82660: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 11044 1726853248.82732: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 11044 1726853248.82789: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 11044 1726853248.82843: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 11044 1726853248.82870: Loading FilterModule 'urlsplit' 
from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 11044 1726853248.82967: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11044 1726853248.82996: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11044 1726853248.83020: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11044 1726853248.83069: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11044 1726853248.83086: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11044 1726853248.83201: variable 'ansible_distribution' from source: facts 11044 1726853248.83205: variable 'ansible_distribution_major_version' from source: facts 11044 1726853248.83220: Evaluated conditional (ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7): True 11044 1726853248.83377: variable '__network_wireless_connections_defined' from source: role '' defaults 11044 1726853248.83537: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11044 1726853248.83560: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11044 1726853248.83593: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11044 1726853248.83631: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11044 1726853248.83657: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11044 1726853248.83706: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11044 1726853248.83729: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11044 1726853248.83752: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11044 1726853248.83790: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11044 1726853248.83812: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11044 1726853248.83850: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11044 1726853248.83870: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11044 1726853248.83895: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11044 1726853248.83935: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11044 1726853248.83990: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11044 1726853248.84225: variable 'network_connections' from source: task vars 11044 1726853248.84258: variable 'controller_profile' from source: play vars 11044 1726853248.84373: variable 'controller_profile' from source: play vars 11044 1726853248.84386: variable 'controller_device' from source: play vars 11044 1726853248.84481: variable 'controller_device' from source: play vars 11044 1726853248.84485: variable 'port1_profile' from source: play vars 11044 1726853248.84524: variable 'port1_profile' from source: play vars 11044 1726853248.84536: variable 'dhcp_interface1' from source: play vars 11044 1726853248.84605: variable 'dhcp_interface1' from source: play vars 11044 1726853248.84676: variable 'controller_profile' from source: play vars 11044 1726853248.84679: variable 'controller_profile' from source: play vars 11044 1726853248.84683: variable 'port2_profile' from source: play vars 11044 
1726853248.84732: variable 'port2_profile' from source: play vars 11044 1726853248.84738: variable 'dhcp_interface2' from source: play vars 11044 1726853248.84813: variable 'dhcp_interface2' from source: play vars 11044 1726853248.84876: variable 'controller_profile' from source: play vars 11044 1726853248.84879: variable 'controller_profile' from source: play vars 11044 1726853248.84960: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 11044 1726853248.85150: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 11044 1726853248.85201: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 11044 1726853248.85233: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 11044 1726853248.85261: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 11044 1726853248.85310: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 11044 1726853248.85350: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 11044 1726853248.85393: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 11044 1726853248.85458: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 11044 1726853248.85537: variable '__network_team_connections_defined' from source: role '' defaults 11044 1726853248.85814: variable 
'network_connections' from source: task vars 11044 1726853248.85819: variable 'controller_profile' from source: play vars 11044 1726853248.85877: variable 'controller_profile' from source: play vars 11044 1726853248.85924: variable 'controller_device' from source: play vars 11044 1726853248.85995: variable 'controller_device' from source: play vars 11044 1726853248.85998: variable 'port1_profile' from source: play vars 11044 1726853248.86116: variable 'port1_profile' from source: play vars 11044 1726853248.86119: variable 'dhcp_interface1' from source: play vars 11044 1726853248.86122: variable 'dhcp_interface1' from source: play vars 11044 1726853248.86124: variable 'controller_profile' from source: play vars 11044 1726853248.86195: variable 'controller_profile' from source: play vars 11044 1726853248.86202: variable 'port2_profile' from source: play vars 11044 1726853248.86264: variable 'port2_profile' from source: play vars 11044 1726853248.86330: variable 'dhcp_interface2' from source: play vars 11044 1726853248.86395: variable 'dhcp_interface2' from source: play vars 11044 1726853248.86398: variable 'controller_profile' from source: play vars 11044 1726853248.86463: variable 'controller_profile' from source: play vars 11044 1726853248.86497: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 11044 1726853248.86500: when evaluation is False, skipping this task 11044 1726853248.86503: _execute() done 11044 1726853248.86505: dumping result to json 11044 1726853248.86560: done dumping result, returning 11044 1726853248.86565: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces [02083763-bbaf-c5a6-f857-00000000002b] 11044 1726853248.86567: sending task result for task 02083763-bbaf-c5a6-f857-00000000002b 11044 1726853248.86629: done sending task result for 
task 02083763-bbaf-c5a6-f857-00000000002b 11044 1726853248.86631: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 11044 1726853248.86712: no more pending results, returning what we have 11044 1726853248.86715: results queue empty 11044 1726853248.86716: checking for any_errors_fatal 11044 1726853248.86723: done checking for any_errors_fatal 11044 1726853248.86724: checking for max_fail_percentage 11044 1726853248.86725: done checking for max_fail_percentage 11044 1726853248.86726: checking to see if all hosts have failed and the running result is not ok 11044 1726853248.86727: done checking to see if all hosts have failed 11044 1726853248.86727: getting the remaining hosts for this loop 11044 1726853248.86729: done getting the remaining hosts for this loop 11044 1726853248.86732: getting the next task for host managed_node1 11044 1726853248.86737: done getting next task for host managed_node1 11044 1726853248.86894: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 11044 1726853248.86898: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11044 1726853248.86912: getting variables 11044 1726853248.86913: in VariableManager get_vars() 11044 1726853248.86951: Calling all_inventory to load vars for managed_node1 11044 1726853248.86954: Calling groups_inventory to load vars for managed_node1 11044 1726853248.86956: Calling all_plugins_inventory to load vars for managed_node1 11044 1726853248.86965: Calling all_plugins_play to load vars for managed_node1 11044 1726853248.86973: Calling groups_plugins_inventory to load vars for managed_node1 11044 1726853248.86978: Calling groups_plugins_play to load vars for managed_node1 11044 1726853248.88363: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11044 1726853248.90197: done with get_vars() 11044 1726853248.90229: done getting variables redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 11044 1726853248.90330: Loading ActionModule 'ansible_collections.ansible.builtin.plugins.action.dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48 Friday 20 September 2024 13:27:28 -0400 (0:00:00.131) 0:00:13.279 ****** 11044 1726853248.90406: entering _queue_task() for managed_node1/yum 11044 1726853248.90409: Creating lock for yum 11044 1726853248.90839: worker is 1 (out of 1 available) 11044 1726853248.90852: exiting _queue_task() for managed_node1/yum 11044 1726853248.90865: done queuing things up, now waiting for results queue to drain 11044 1726853248.90866: waiting for pending results... 
11044 1726853248.91352: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 11044 1726853248.91442: in run() - task 02083763-bbaf-c5a6-f857-00000000002c 11044 1726853248.91450: variable 'ansible_search_path' from source: unknown 11044 1726853248.91453: variable 'ansible_search_path' from source: unknown 11044 1726853248.91455: calling self._execute() 11044 1726853248.91528: variable 'ansible_host' from source: host vars for 'managed_node1' 11044 1726853248.91535: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11044 1726853248.91551: variable 'omit' from source: magic vars 11044 1726853248.91940: variable 'ansible_distribution_major_version' from source: facts 11044 1726853248.91952: Evaluated conditional (ansible_distribution_major_version != '6'): True 11044 1726853248.92212: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 11044 1726853248.95252: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 11044 1726853248.95310: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 11044 1726853248.95350: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 11044 1726853248.95383: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 11044 1726853248.95407: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 11044 1726853248.95570: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11044 1726853248.95578: Loading 
FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11044 1726853248.95585: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11044 1726853248.95588: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11044 1726853248.95606: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11044 1726853248.95750: variable 'ansible_distribution_major_version' from source: facts 11044 1726853248.95780: Evaluated conditional (ansible_distribution_major_version | int < 8): False 11044 1726853248.95783: when evaluation is False, skipping this task 11044 1726853248.95786: _execute() done 11044 1726853248.95791: dumping result to json 11044 1726853248.95793: done dumping result, returning 11044 1726853248.95802: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces [02083763-bbaf-c5a6-f857-00000000002c] 11044 1726853248.95805: sending task result for task 02083763-bbaf-c5a6-f857-00000000002c skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version | int < 8", "skip_reason": "Conditional result was False" } 11044 1726853248.95951: no more pending results, returning what we have 11044 1726853248.95955: results queue empty 11044 1726853248.95956: checking for any_errors_fatal 11044 1726853248.95961: done 
checking for any_errors_fatal 11044 1726853248.95962: checking for max_fail_percentage 11044 1726853248.95963: done checking for max_fail_percentage 11044 1726853248.95964: checking to see if all hosts have failed and the running result is not ok 11044 1726853248.95965: done checking to see if all hosts have failed 11044 1726853248.95966: getting the remaining hosts for this loop 11044 1726853248.95967: done getting the remaining hosts for this loop 11044 1726853248.95974: getting the next task for host managed_node1 11044 1726853248.95983: done getting next task for host managed_node1 11044 1726853248.95987: ^ task is: TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 11044 1726853248.95991: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11044 1726853248.96007: done sending task result for task 02083763-bbaf-c5a6-f857-00000000002c 11044 1726853248.96010: WORKER PROCESS EXITING 11044 1726853248.96190: getting variables 11044 1726853248.96192: in VariableManager get_vars() 11044 1726853248.96231: Calling all_inventory to load vars for managed_node1 11044 1726853248.96234: Calling groups_inventory to load vars for managed_node1 11044 1726853248.96237: Calling all_plugins_inventory to load vars for managed_node1 11044 1726853248.96245: Calling all_plugins_play to load vars for managed_node1 11044 1726853248.96248: Calling groups_plugins_inventory to load vars for managed_node1 11044 1726853248.96251: Calling groups_plugins_play to load vars for managed_node1 11044 1726853248.97824: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11044 1726853248.99624: done with get_vars() 11044 1726853248.99655: done getting variables 11044 1726853248.99714: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60 Friday 20 September 2024 13:27:28 -0400 (0:00:00.093) 0:00:13.373 ****** 11044 1726853248.99758: entering _queue_task() for managed_node1/fail 11044 1726853249.00394: worker is 1 (out of 1 available) 11044 1726853249.00404: exiting _queue_task() for managed_node1/fail 11044 1726853249.00414: done queuing things up, now waiting for results queue to drain 11044 1726853249.00415: waiting for pending results... 
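Two details of the YUM task just above are worth unpacking: the action was transparently redirected (`ansible.builtin.yum` to `ansible.builtin.dnf`), and it was then skipped because `ansible_distribution_major_version | int < 8` is False on this host — the YUM code path only applies to EL7 and earlier. A minimal sketch of that version gate, as an illustrative emulation rather than the role's actual logic (the inputs below are examples, not this host's fact):

```python
def should_run_yum_check(ansible_distribution_major_version):
    """Emulate the task's when-clause:
    ansible_distribution_major_version | int < 8
    Facts arrive as strings, so the | int cast happens before comparing.
    """
    return int(ansible_distribution_major_version) < 8

print(should_run_yum_check("7"))  # True  -> YUM path would run on EL7
print(should_run_yum_check("9"))  # False -> task skipped, as in this log
```

Since the condition was False here, the result is the `skip_reason: Conditional result was False` entry printed above, and execution moves on to the NetworkManager-restart consent task.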
11044 1726853249.00523: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 11044 1726853249.00665: in run() - task 02083763-bbaf-c5a6-f857-00000000002d 11044 1726853249.00686: variable 'ansible_search_path' from source: unknown 11044 1726853249.00694: variable 'ansible_search_path' from source: unknown 11044 1726853249.00732: calling self._execute() 11044 1726853249.00896: variable 'ansible_host' from source: host vars for 'managed_node1' 11044 1726853249.00907: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11044 1726853249.00919: variable 'omit' from source: magic vars 11044 1726853249.01305: variable 'ansible_distribution_major_version' from source: facts 11044 1726853249.01414: Evaluated conditional (ansible_distribution_major_version != '6'): True 11044 1726853249.01442: variable '__network_wireless_connections_defined' from source: role '' defaults 11044 1726853249.01637: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 11044 1726853249.04500: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 11044 1726853249.04575: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 11044 1726853249.04622: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 11044 1726853249.04662: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 11044 1726853249.04694: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 11044 1726853249.04776: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 
(found_in_cache=True, class_only=False) 11044 1726853249.04808: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11044 1726853249.04842: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11044 1726853249.04886: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11044 1726853249.04905: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11044 1726853249.04957: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11044 1726853249.04985: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11044 1726853249.05012: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11044 1726853249.05057: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11044 1726853249.05145: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11044 1726853249.05148: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11044 1726853249.05151: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11044 1726853249.05175: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11044 1726853249.05215: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11044 1726853249.05234: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11044 1726853249.05408: variable 'network_connections' from source: task vars 11044 1726853249.05427: variable 'controller_profile' from source: play vars 11044 1726853249.05501: variable 'controller_profile' from source: play vars 11044 1726853249.05516: variable 'controller_device' from source: play vars 11044 1726853249.05585: variable 'controller_device' from source: play vars 11044 1726853249.05601: variable 'port1_profile' from source: play vars 11044 1726853249.05661: variable 'port1_profile' from source: play vars 11044 1726853249.05678: variable 'dhcp_interface1' from source: play vars 11044 1726853249.05776: variable 'dhcp_interface1' from source: play vars 11044 1726853249.05779: variable 'controller_profile' from source: play vars 11044 
1726853249.05824: variable 'controller_profile' from source: play vars 11044 1726853249.05837: variable 'port2_profile' from source: play vars 11044 1726853249.05905: variable 'port2_profile' from source: play vars 11044 1726853249.05917: variable 'dhcp_interface2' from source: play vars 11044 1726853249.06012: variable 'dhcp_interface2' from source: play vars 11044 1726853249.06016: variable 'controller_profile' from source: play vars 11044 1726853249.06052: variable 'controller_profile' from source: play vars 11044 1726853249.06229: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 11044 1726853249.06382: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 11044 1726853249.06421: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 11044 1726853249.06459: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 11044 1726853249.06494: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 11044 1726853249.06539: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 11044 1726853249.06570: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 11044 1726853249.06663: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 11044 1726853249.06666: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, 
class_only=False) 11044 1726853249.06807: variable '__network_team_connections_defined' from source: role '' defaults 11044 1726853249.07376: variable 'network_connections' from source: task vars 11044 1726853249.07379: variable 'controller_profile' from source: play vars 11044 1726853249.07414: variable 'controller_profile' from source: play vars 11044 1726853249.07678: variable 'controller_device' from source: play vars 11044 1726853249.07681: variable 'controller_device' from source: play vars 11044 1726853249.07684: variable 'port1_profile' from source: play vars 11044 1726853249.07723: variable 'port1_profile' from source: play vars 11044 1726853249.07735: variable 'dhcp_interface1' from source: play vars 11044 1726853249.07911: variable 'dhcp_interface1' from source: play vars 11044 1726853249.07922: variable 'controller_profile' from source: play vars 11044 1726853249.07986: variable 'controller_profile' from source: play vars 11044 1726853249.08010: variable 'port2_profile' from source: play vars 11044 1726853249.08081: variable 'port2_profile' from source: play vars 11044 1726853249.08093: variable 'dhcp_interface2' from source: play vars 11044 1726853249.08157: variable 'dhcp_interface2' from source: play vars 11044 1726853249.08168: variable 'controller_profile' from source: play vars 11044 1726853249.08249: variable 'controller_profile' from source: play vars 11044 1726853249.08287: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 11044 1726853249.08296: when evaluation is False, skipping this task 11044 1726853249.08303: _execute() done 11044 1726853249.08310: dumping result to json 11044 1726853249.08317: done dumping result, returning 11044 1726853249.08332: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces [02083763-bbaf-c5a6-f857-00000000002d] 11044 1726853249.08340: sending 
task result for task 02083763-bbaf-c5a6-f857-00000000002d skipping: [managed_node1] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 11044 1726853249.08492: no more pending results, returning what we have 11044 1726853249.08495: results queue empty 11044 1726853249.08497: checking for any_errors_fatal 11044 1726853249.08504: done checking for any_errors_fatal 11044 1726853249.08504: checking for max_fail_percentage 11044 1726853249.08506: done checking for max_fail_percentage 11044 1726853249.08507: checking to see if all hosts have failed and the running result is not ok 11044 1726853249.08508: done checking to see if all hosts have failed 11044 1726853249.08509: getting the remaining hosts for this loop 11044 1726853249.08510: done getting the remaining hosts for this loop 11044 1726853249.08514: getting the next task for host managed_node1 11044 1726853249.08520: done getting next task for host managed_node1 11044 1726853249.08524: ^ task is: TASK: fedora.linux_system_roles.network : Install packages 11044 1726853249.08527: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11044 1726853249.08541: getting variables 11044 1726853249.08542: in VariableManager get_vars() 11044 1726853249.08587: Calling all_inventory to load vars for managed_node1 11044 1726853249.08590: Calling groups_inventory to load vars for managed_node1 11044 1726853249.08593: Calling all_plugins_inventory to load vars for managed_node1 11044 1726853249.08603: Calling all_plugins_play to load vars for managed_node1 11044 1726853249.08606: Calling groups_plugins_inventory to load vars for managed_node1 11044 1726853249.08609: Calling groups_plugins_play to load vars for managed_node1 11044 1726853249.09297: done sending task result for task 02083763-bbaf-c5a6-f857-00000000002d 11044 1726853249.09300: WORKER PROCESS EXITING 11044 1726853249.10944: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11044 1726853249.12842: done with get_vars() 11044 1726853249.12867: done getting variables 11044 1726853249.12927: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install packages] ******************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73 Friday 20 September 2024 13:27:29 -0400 (0:00:00.132) 0:00:13.505 ****** 11044 1726853249.12961: entering _queue_task() for managed_node1/package 11044 1726853249.13282: worker is 1 (out of 1 available) 11044 1726853249.13294: exiting _queue_task() for managed_node1/package 11044 1726853249.13309: done queuing things up, now waiting for results queue to drain 11044 1726853249.13310: waiting for pending results... 
11044 1726853249.13656: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install packages 11044 1726853249.13824: in run() - task 02083763-bbaf-c5a6-f857-00000000002e 11044 1726853249.13843: variable 'ansible_search_path' from source: unknown 11044 1726853249.13852: variable 'ansible_search_path' from source: unknown 11044 1726853249.13893: calling self._execute() 11044 1726853249.14035: variable 'ansible_host' from source: host vars for 'managed_node1' 11044 1726853249.14051: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11044 1726853249.14067: variable 'omit' from source: magic vars 11044 1726853249.15076: variable 'ansible_distribution_major_version' from source: facts 11044 1726853249.15081: Evaluated conditional (ansible_distribution_major_version != '6'): True 11044 1726853249.15402: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 11044 1726853249.15894: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 11044 1726853249.15947: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 11044 1726853249.15988: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 11044 1726853249.16025: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 11044 1726853249.16150: variable 'network_packages' from source: role '' defaults 11044 1726853249.16269: variable '__network_provider_setup' from source: role '' defaults 11044 1726853249.16286: variable '__network_service_name_default_nm' from source: role '' defaults 11044 1726853249.16353: variable '__network_service_name_default_nm' from source: role '' defaults 11044 1726853249.16374: variable '__network_packages_default_nm' from source: role '' defaults 11044 1726853249.16437: variable 
'__network_packages_default_nm' from source: role '' defaults 11044 1726853249.16632: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 11044 1726853249.19607: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 11044 1726853249.19756: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 11044 1726853249.19978: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 11044 1726853249.19981: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 11044 1726853249.19984: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 11044 1726853249.20126: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11044 1726853249.20203: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11044 1726853249.20377: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11044 1726853249.20387: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11044 1726853249.20414: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11044 
1726853249.20468: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11044 1726853249.20712: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11044 1726853249.20715: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11044 1726853249.20718: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11044 1726853249.20720: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11044 1726853249.21163: variable '__network_packages_default_gobject_packages' from source: role '' defaults 11044 1726853249.21436: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11044 1726853249.21501: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11044 1726853249.21727: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11044 1726853249.21730: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11044 1726853249.21732: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11044 1726853249.21897: variable 'ansible_python' from source: facts 11044 1726853249.21927: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 11044 1726853249.22025: variable '__network_wpa_supplicant_required' from source: role '' defaults 11044 1726853249.22237: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 11044 1726853249.22504: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11044 1726853249.22604: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11044 1726853249.22633: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11044 1726853249.22717: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11044 1726853249.22875: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11044 1726853249.22878: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11044 1726853249.22909: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11044 1726853249.23001: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11044 1726853249.23050: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11044 1726853249.23156: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11044 1726853249.23412: variable 'network_connections' from source: task vars 11044 1726853249.23425: variable 'controller_profile' from source: play vars 11044 1726853249.23649: variable 'controller_profile' from source: play vars 11044 1726853249.23663: variable 'controller_device' from source: play vars 11044 1726853249.23878: variable 'controller_device' from source: play vars 11044 1726853249.23895: variable 'port1_profile' from source: play vars 11044 1726853249.24109: variable 'port1_profile' from source: play vars 11044 1726853249.24125: variable 'dhcp_interface1' from source: play vars 11044 1726853249.24376: variable 'dhcp_interface1' from source: play vars 11044 1726853249.24379: variable 'controller_profile' from source: play vars 11044 1726853249.24590: variable 'controller_profile' from source: play vars 11044 1726853249.24604: variable 'port2_profile' from source: play vars 11044 
1726853249.24809: variable 'port2_profile' from source: play vars 11044 1726853249.24824: variable 'dhcp_interface2' from source: play vars 11044 1726853249.25058: variable 'dhcp_interface2' from source: play vars 11044 1726853249.25061: variable 'controller_profile' from source: play vars 11044 1726853249.25266: variable 'controller_profile' from source: play vars 11044 1726853249.25450: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 11044 1726853249.25513: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 11044 1726853249.25548: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 11044 1726853249.25643: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 11044 1726853249.25780: variable '__network_wireless_connections_defined' from source: role '' defaults 11044 1726853249.26338: variable 'network_connections' from source: task vars 11044 1726853249.26383: variable 'controller_profile' from source: play vars 11044 1726853249.26786: variable 'controller_profile' from source: play vars 11044 1726853249.26788: variable 'controller_device' from source: play vars 11044 1726853249.26790: variable 'controller_device' from source: play vars 11044 1726853249.26903: variable 'port1_profile' from source: play vars 11044 1726853249.27031: variable 'port1_profile' from source: play vars 11044 1726853249.27088: variable 'dhcp_interface1' from source: play vars 11044 1726853249.27190: variable 'dhcp_interface1' from source: 
play vars 11044 1726853249.27338: variable 'controller_profile' from source: play vars 11044 1726853249.27652: variable 'controller_profile' from source: play vars 11044 1726853249.27656: variable 'port2_profile' from source: play vars 11044 1726853249.27779: variable 'port2_profile' from source: play vars 11044 1726853249.27794: variable 'dhcp_interface2' from source: play vars 11044 1726853249.28009: variable 'dhcp_interface2' from source: play vars 11044 1726853249.28023: variable 'controller_profile' from source: play vars 11044 1726853249.28236: variable 'controller_profile' from source: play vars 11044 1726853249.28331: variable '__network_packages_default_wireless' from source: role '' defaults 11044 1726853249.28542: variable '__network_wireless_connections_defined' from source: role '' defaults 11044 1726853249.29222: variable 'network_connections' from source: task vars 11044 1726853249.29232: variable 'controller_profile' from source: play vars 11044 1726853249.29404: variable 'controller_profile' from source: play vars 11044 1726853249.29415: variable 'controller_device' from source: play vars 11044 1726853249.29776: variable 'controller_device' from source: play vars 11044 1726853249.29779: variable 'port1_profile' from source: play vars 11044 1726853249.29782: variable 'port1_profile' from source: play vars 11044 1726853249.29784: variable 'dhcp_interface1' from source: play vars 11044 1726853249.29834: variable 'dhcp_interface1' from source: play vars 11044 1726853249.29849: variable 'controller_profile' from source: play vars 11044 1726853249.29914: variable 'controller_profile' from source: play vars 11044 1726853249.29987: variable 'port2_profile' from source: play vars 11044 1726853249.30177: variable 'port2_profile' from source: play vars 11044 1726853249.30190: variable 'dhcp_interface2' from source: play vars 11044 1726853249.30256: variable 'dhcp_interface2' from source: play vars 11044 1726853249.30476: variable 'controller_profile' from 
source: play vars 11044 1726853249.30479: variable 'controller_profile' from source: play vars 11044 1726853249.30481: variable '__network_packages_default_team' from source: role '' defaults 11044 1726853249.30558: variable '__network_team_connections_defined' from source: role '' defaults 11044 1726853249.31478: variable 'network_connections' from source: task vars 11044 1726853249.31482: variable 'controller_profile' from source: play vars 11044 1726853249.31485: variable 'controller_profile' from source: play vars 11044 1726853249.31487: variable 'controller_device' from source: play vars 11044 1726853249.31532: variable 'controller_device' from source: play vars 11044 1726853249.31550: variable 'port1_profile' from source: play vars 11044 1726853249.31977: variable 'port1_profile' from source: play vars 11044 1726853249.31980: variable 'dhcp_interface1' from source: play vars 11044 1726853249.31983: variable 'dhcp_interface1' from source: play vars 11044 1726853249.31985: variable 'controller_profile' from source: play vars 11044 1726853249.31987: variable 'controller_profile' from source: play vars 11044 1726853249.31993: variable 'port2_profile' from source: play vars 11044 1726853249.32058: variable 'port2_profile' from source: play vars 11044 1726853249.32376: variable 'dhcp_interface2' from source: play vars 11044 1726853249.32379: variable 'dhcp_interface2' from source: play vars 11044 1726853249.32381: variable 'controller_profile' from source: play vars 11044 1726853249.32420: variable 'controller_profile' from source: play vars 11044 1726853249.32694: variable '__network_service_name_default_initscripts' from source: role '' defaults 11044 1726853249.32759: variable '__network_service_name_default_initscripts' from source: role '' defaults 11044 1726853249.32774: variable '__network_packages_default_initscripts' from source: role '' defaults 11044 1726853249.32832: variable '__network_packages_default_initscripts' from source: role '' defaults 11044 
1726853249.33255: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 11044 1726853249.34030: variable 'network_connections' from source: task vars 11044 1726853249.34576: variable 'controller_profile' from source: play vars 11044 1726853249.34579: variable 'controller_profile' from source: play vars 11044 1726853249.34587: variable 'controller_device' from source: play vars 11044 1726853249.34589: variable 'controller_device' from source: play vars 11044 1726853249.34591: variable 'port1_profile' from source: play vars 11044 1726853249.34593: variable 'port1_profile' from source: play vars 11044 1726853249.34595: variable 'dhcp_interface1' from source: play vars 11044 1726853249.34826: variable 'dhcp_interface1' from source: play vars 11044 1726853249.34838: variable 'controller_profile' from source: play vars 11044 1726853249.34897: variable 'controller_profile' from source: play vars 11044 1726853249.34909: variable 'port2_profile' from source: play vars 11044 1726853249.34969: variable 'port2_profile' from source: play vars 11044 1726853249.35376: variable 'dhcp_interface2' from source: play vars 11044 1726853249.35379: variable 'dhcp_interface2' from source: play vars 11044 1726853249.35381: variable 'controller_profile' from source: play vars 11044 1726853249.35383: variable 'controller_profile' from source: play vars 11044 1726853249.35385: variable 'ansible_distribution' from source: facts 11044 1726853249.35387: variable '__network_rh_distros' from source: role '' defaults 11044 1726853249.35389: variable 'ansible_distribution_major_version' from source: facts 11044 1726853249.35391: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 11044 1726853249.35602: variable 'ansible_distribution' from source: facts 11044 1726853249.35684: variable '__network_rh_distros' from source: role '' defaults 11044 1726853249.35694: variable 'ansible_distribution_major_version' from source: 
facts 11044 1726853249.35710: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 11044 1726853249.35935: variable 'ansible_distribution' from source: facts 11044 1726853249.36083: variable '__network_rh_distros' from source: role '' defaults 11044 1726853249.36093: variable 'ansible_distribution_major_version' from source: facts 11044 1726853249.36134: variable 'network_provider' from source: set_fact 11044 1726853249.36194: variable 'ansible_facts' from source: unknown 11044 1726853249.37438: Evaluated conditional (not network_packages is subset(ansible_facts.packages.keys())): False 11044 1726853249.37681: when evaluation is False, skipping this task 11044 1726853249.37690: _execute() done 11044 1726853249.37698: dumping result to json 11044 1726853249.37705: done dumping result, returning 11044 1726853249.37717: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install packages [02083763-bbaf-c5a6-f857-00000000002e] 11044 1726853249.37725: sending task result for task 02083763-bbaf-c5a6-f857-00000000002e 11044 1726853249.37838: done sending task result for task 02083763-bbaf-c5a6-f857-00000000002e 11044 1726853249.37849: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "not network_packages is subset(ansible_facts.packages.keys())", "skip_reason": "Conditional result was False" } 11044 1726853249.37909: no more pending results, returning what we have 11044 1726853249.37912: results queue empty 11044 1726853249.37913: checking for any_errors_fatal 11044 1726853249.37919: done checking for any_errors_fatal 11044 1726853249.37919: checking for max_fail_percentage 11044 1726853249.37921: done checking for max_fail_percentage 11044 1726853249.37922: checking to see if all hosts have failed and the running result is not ok 11044 1726853249.37923: done checking to see if all hosts have failed 11044 1726853249.37923: getting the remaining hosts for 
this loop 11044 1726853249.37924: done getting the remaining hosts for this loop 11044 1726853249.37928: getting the next task for host managed_node1 11044 1726853249.37934: done getting next task for host managed_node1 11044 1726853249.37937: ^ task is: TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 11044 1726853249.37940: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11044 1726853249.37954: getting variables 11044 1726853249.37955: in VariableManager get_vars() 11044 1726853249.37999: Calling all_inventory to load vars for managed_node1 11044 1726853249.38002: Calling groups_inventory to load vars for managed_node1 11044 1726853249.38005: Calling all_plugins_inventory to load vars for managed_node1 11044 1726853249.38015: Calling all_plugins_play to load vars for managed_node1 11044 1726853249.38018: Calling groups_plugins_inventory to load vars for managed_node1 11044 1726853249.38021: Calling groups_plugins_play to load vars for managed_node1 11044 1726853249.42364: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11044 1726853249.46347: done with get_vars() 11044 1726853249.46425: done getting variables 11044 1726853249.46492: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85 Friday 20 September 2024 13:27:29 -0400 (0:00:00.335) 0:00:13.841 ****** 11044 1726853249.46645: entering _queue_task() for managed_node1/package 11044 1726853249.47406: worker is 1 (out of 1 available) 11044 1726853249.47419: exiting _queue_task() for managed_node1/package 11044 1726853249.47505: done queuing things up, now waiting for results queue to drain 11044 1726853249.47507: waiting for pending results... 11044 1726853249.48086: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 11044 1726853249.48408: in run() - task 02083763-bbaf-c5a6-f857-00000000002f 11044 1726853249.48429: variable 'ansible_search_path' from source: unknown 11044 1726853249.48437: variable 'ansible_search_path' from source: unknown 11044 1726853249.48776: calling self._execute() 11044 1726853249.48781: variable 'ansible_host' from source: host vars for 'managed_node1' 11044 1726853249.48784: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11044 1726853249.48797: variable 'omit' from source: magic vars 11044 1726853249.49500: variable 'ansible_distribution_major_version' from source: facts 11044 1726853249.49573: Evaluated conditional (ansible_distribution_major_version != '6'): True 11044 1726853249.49695: variable 'network_state' from source: role '' defaults 11044 1726853249.49758: Evaluated conditional (network_state != {}): False 11044 1726853249.49768: when evaluation is False, skipping this task 11044 1726853249.49779: _execute() done 11044 
1726853249.49788: dumping result to json 11044 1726853249.49796: done dumping result, returning 11044 1726853249.49808: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable [02083763-bbaf-c5a6-f857-00000000002f] 11044 1726853249.49817: sending task result for task 02083763-bbaf-c5a6-f857-00000000002f skipping: [managed_node1] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 11044 1726853249.49988: no more pending results, returning what we have 11044 1726853249.49992: results queue empty 11044 1726853249.49993: checking for any_errors_fatal 11044 1726853249.49998: done checking for any_errors_fatal 11044 1726853249.49999: checking for max_fail_percentage 11044 1726853249.50001: done checking for max_fail_percentage 11044 1726853249.50002: checking to see if all hosts have failed and the running result is not ok 11044 1726853249.50003: done checking to see if all hosts have failed 11044 1726853249.50004: getting the remaining hosts for this loop 11044 1726853249.50005: done getting the remaining hosts for this loop 11044 1726853249.50009: getting the next task for host managed_node1 11044 1726853249.50016: done getting next task for host managed_node1 11044 1726853249.50020: ^ task is: TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 11044 1726853249.50024: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False 11044 1726853249.50057: getting variables 11044 1726853249.50060: in VariableManager get_vars() 11044 1726853249.50104: Calling all_inventory to load vars for managed_node1 11044 1726853249.50107: Calling groups_inventory to load vars for managed_node1 11044 1726853249.50109: Calling all_plugins_inventory to load vars for managed_node1 11044 1726853249.50120: Calling all_plugins_play to load vars for managed_node1 11044 1726853249.50123: Calling groups_plugins_inventory to load vars for managed_node1 11044 1726853249.50125: Calling groups_plugins_play to load vars for managed_node1 11044 1726853249.51228: done sending task result for task 02083763-bbaf-c5a6-f857-00000000002f 11044 1726853249.51231: WORKER PROCESS EXITING 11044 1726853249.52298: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11044 1726853249.58342: done with get_vars() 11044 1726853249.58366: done getting variables 11044 1726853249.58413: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96 Friday 20 September 2024 13:27:29 -0400 (0:00:00.119) 0:00:13.960 ****** 11044 1726853249.58441: entering _queue_task() for managed_node1/package 11044 1726853249.58785: worker is 1 (out of 1 available) 11044 1726853249.58799: exiting _queue_task() for managed_node1/package 11044 1726853249.58811: done queuing things up, now waiting for results queue to drain 11044 1726853249.58812: waiting for pending results... 
11044 1726853249.59095: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 11044 1726853249.59253: in run() - task 02083763-bbaf-c5a6-f857-000000000030 11044 1726853249.59275: variable 'ansible_search_path' from source: unknown 11044 1726853249.59295: variable 'ansible_search_path' from source: unknown 11044 1726853249.59336: calling self._execute() 11044 1726853249.59451: variable 'ansible_host' from source: host vars for 'managed_node1' 11044 1726853249.59464: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11044 1726853249.59480: variable 'omit' from source: magic vars 11044 1726853249.60276: variable 'ansible_distribution_major_version' from source: facts 11044 1726853249.60280: Evaluated conditional (ansible_distribution_major_version != '6'): True 11044 1726853249.60451: variable 'network_state' from source: role '' defaults 11044 1726853249.60468: Evaluated conditional (network_state != {}): False 11044 1726853249.60523: when evaluation is False, skipping this task 11044 1726853249.60530: _execute() done 11044 1726853249.60537: dumping result to json 11044 1726853249.60546: done dumping result, returning 11044 1726853249.60556: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable [02083763-bbaf-c5a6-f857-000000000030] 11044 1726853249.60563: sending task result for task 02083763-bbaf-c5a6-f857-000000000030 skipping: [managed_node1] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 11044 1726853249.60791: no more pending results, returning what we have 11044 1726853249.60796: results queue empty 11044 1726853249.60797: checking for any_errors_fatal 11044 1726853249.60806: done checking for any_errors_fatal 11044 1726853249.60807: checking for max_fail_percentage 11044 
1726853249.60809: done checking for max_fail_percentage 11044 1726853249.60810: checking to see if all hosts have failed and the running result is not ok 11044 1726853249.60811: done checking to see if all hosts have failed 11044 1726853249.60812: getting the remaining hosts for this loop 11044 1726853249.60813: done getting the remaining hosts for this loop 11044 1726853249.60818: getting the next task for host managed_node1 11044 1726853249.60825: done getting next task for host managed_node1 11044 1726853249.60830: ^ task is: TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 11044 1726853249.60834: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11044 1726853249.60854: getting variables 11044 1726853249.60856: in VariableManager get_vars() 11044 1726853249.60904: Calling all_inventory to load vars for managed_node1 11044 1726853249.60907: Calling groups_inventory to load vars for managed_node1 11044 1726853249.60910: Calling all_plugins_inventory to load vars for managed_node1 11044 1726853249.60926: Calling all_plugins_play to load vars for managed_node1 11044 1726853249.60929: Calling groups_plugins_inventory to load vars for managed_node1 11044 1726853249.60933: Calling groups_plugins_play to load vars for managed_node1 11044 1726853249.61802: done sending task result for task 02083763-bbaf-c5a6-f857-000000000030 11044 1726853249.61805: WORKER PROCESS EXITING 11044 1726853249.62744: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11044 1726853249.63877: done with get_vars() 11044 1726853249.63895: done getting variables 11044 1726853249.63968: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109 Friday 20 September 2024 13:27:29 -0400 (0:00:00.055) 0:00:14.015 ****** 11044 1726853249.63993: entering _queue_task() for managed_node1/service 11044 1726853249.63995: Creating lock for service 11044 1726853249.64243: worker is 1 (out of 1 available) 11044 1726853249.64255: exiting _queue_task() for managed_node1/service 11044 1726853249.64269: done queuing things up, now waiting for results queue to drain 11044 1726853249.64270: waiting for pending results... 
11044 1726853249.64443: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 11044 1726853249.64533: in run() - task 02083763-bbaf-c5a6-f857-000000000031 11044 1726853249.64544: variable 'ansible_search_path' from source: unknown 11044 1726853249.64547: variable 'ansible_search_path' from source: unknown 11044 1726853249.64577: calling self._execute() 11044 1726853249.64654: variable 'ansible_host' from source: host vars for 'managed_node1' 11044 1726853249.64657: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11044 1726853249.64666: variable 'omit' from source: magic vars 11044 1726853249.65023: variable 'ansible_distribution_major_version' from source: facts 11044 1726853249.65033: Evaluated conditional (ansible_distribution_major_version != '6'): True 11044 1726853249.65161: variable '__network_wireless_connections_defined' from source: role '' defaults 11044 1726853249.65449: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 11044 1726853249.67822: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 11044 1726853249.67877: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 11044 1726853249.67913: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 11044 1726853249.67943: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 11044 1726853249.67984: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 11044 1726853249.68080: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, 
class_only=False) 11044 1726853249.68096: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11044 1726853249.68127: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11044 1726853249.68197: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11044 1726853249.68201: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11044 1726853249.68268: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11044 1726853249.68273: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11044 1726853249.68326: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11044 1726853249.68348: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11044 1726853249.68355: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11044 1726853249.68404: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11044 1726853249.68431: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11044 1726853249.68458: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11044 1726853249.68491: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11044 1726853249.68500: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11044 1726853249.68694: variable 'network_connections' from source: task vars 11044 1726853249.68697: variable 'controller_profile' from source: play vars 11044 1726853249.68769: variable 'controller_profile' from source: play vars 11044 1726853249.68776: variable 'controller_device' from source: play vars 11044 1726853249.68829: variable 'controller_device' from source: play vars 11044 1726853249.68841: variable 'port1_profile' from source: play vars 11044 1726853249.68923: variable 'port1_profile' from source: play vars 11044 1726853249.68927: variable 'dhcp_interface1' from source: play vars 11044 1726853249.69014: variable 'dhcp_interface1' from source: play vars 11044 1726853249.69018: variable 'controller_profile' from source: play vars 11044 
1726853249.69066: variable 'controller_profile' from source: play vars 11044 1726853249.69069: variable 'port2_profile' from source: play vars 11044 1726853249.69156: variable 'port2_profile' from source: play vars 11044 1726853249.69159: variable 'dhcp_interface2' from source: play vars 11044 1726853249.69234: variable 'dhcp_interface2' from source: play vars 11044 1726853249.69240: variable 'controller_profile' from source: play vars 11044 1726853249.69243: variable 'controller_profile' from source: play vars 11044 1726853249.69360: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 11044 1726853249.69892: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 11044 1726853249.69895: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 11044 1726853249.69898: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 11044 1726853249.69926: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 11044 1726853249.69967: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 11044 1726853249.70015: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 11044 1726853249.70018: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 11044 1726853249.70047: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, 
class_only=False) 11044 1726853249.70125: variable '__network_team_connections_defined' from source: role '' defaults 11044 1726853249.70342: variable 'network_connections' from source: task vars 11044 1726853249.70348: variable 'controller_profile' from source: play vars 11044 1726853249.70405: variable 'controller_profile' from source: play vars 11044 1726853249.70412: variable 'controller_device' from source: play vars 11044 1726853249.70466: variable 'controller_device' from source: play vars 11044 1726853249.70476: variable 'port1_profile' from source: play vars 11044 1726853249.70577: variable 'port1_profile' from source: play vars 11044 1726853249.70581: variable 'dhcp_interface1' from source: play vars 11044 1726853249.70619: variable 'dhcp_interface1' from source: play vars 11044 1726853249.70627: variable 'controller_profile' from source: play vars 11044 1726853249.70731: variable 'controller_profile' from source: play vars 11044 1726853249.70734: variable 'port2_profile' from source: play vars 11044 1726853249.70747: variable 'port2_profile' from source: play vars 11044 1726853249.70754: variable 'dhcp_interface2' from source: play vars 11044 1726853249.70918: variable 'dhcp_interface2' from source: play vars 11044 1726853249.70921: variable 'controller_profile' from source: play vars 11044 1726853249.70924: variable 'controller_profile' from source: play vars 11044 1726853249.70926: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 11044 1726853249.70928: when evaluation is False, skipping this task 11044 1726853249.70931: _execute() done 11044 1726853249.70932: dumping result to json 11044 1726853249.70934: done dumping result, returning 11044 1726853249.70936: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces [02083763-bbaf-c5a6-f857-000000000031] 11044 1726853249.70938: sending task result for task 
02083763-bbaf-c5a6-f857-000000000031 11044 1726853249.71223: done sending task result for task 02083763-bbaf-c5a6-f857-000000000031 11044 1726853249.71226: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 11044 1726853249.71319: no more pending results, returning what we have 11044 1726853249.71322: results queue empty 11044 1726853249.71323: checking for any_errors_fatal 11044 1726853249.71330: done checking for any_errors_fatal 11044 1726853249.71331: checking for max_fail_percentage 11044 1726853249.71338: done checking for max_fail_percentage 11044 1726853249.71339: checking to see if all hosts have failed and the running result is not ok 11044 1726853249.71340: done checking to see if all hosts have failed 11044 1726853249.71340: getting the remaining hosts for this loop 11044 1726853249.71341: done getting the remaining hosts for this loop 11044 1726853249.71347: getting the next task for host managed_node1 11044 1726853249.71352: done getting next task for host managed_node1 11044 1726853249.71357: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 11044 1726853249.71399: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11044 1726853249.71414: getting variables 11044 1726853249.71416: in VariableManager get_vars() 11044 1726853249.71517: Calling all_inventory to load vars for managed_node1 11044 1726853249.71520: Calling groups_inventory to load vars for managed_node1 11044 1726853249.71522: Calling all_plugins_inventory to load vars for managed_node1 11044 1726853249.71531: Calling all_plugins_play to load vars for managed_node1 11044 1726853249.71534: Calling groups_plugins_inventory to load vars for managed_node1 11044 1726853249.71569: Calling groups_plugins_play to load vars for managed_node1 11044 1726853249.74086: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11044 1726853249.75820: done with get_vars() 11044 1726853249.75847: done getting variables 11044 1726853249.75916: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start NetworkManager] ***** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122 Friday 20 September 2024 13:27:29 -0400 (0:00:00.119) 0:00:14.135 ****** 11044 1726853249.75950: entering _queue_task() for managed_node1/service 11044 1726853249.76416: worker is 1 (out of 1 available) 11044 1726853249.76429: exiting _queue_task() for managed_node1/service 11044 1726853249.76441: done queuing things up, now waiting for results queue to drain 11044 1726853249.76443: waiting for pending results... 
11044 1726853249.76863: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 11044 1726853249.76957: in run() - task 02083763-bbaf-c5a6-f857-000000000032 11044 1726853249.76962: variable 'ansible_search_path' from source: unknown 11044 1726853249.76965: variable 'ansible_search_path' from source: unknown 11044 1726853249.77060: calling self._execute() 11044 1726853249.77115: variable 'ansible_host' from source: host vars for 'managed_node1' 11044 1726853249.77128: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11044 1726853249.77142: variable 'omit' from source: magic vars 11044 1726853249.77579: variable 'ansible_distribution_major_version' from source: facts 11044 1726853249.77596: Evaluated conditional (ansible_distribution_major_version != '6'): True 11044 1726853249.77788: variable 'network_provider' from source: set_fact 11044 1726853249.77799: variable 'network_state' from source: role '' defaults 11044 1726853249.77828: Evaluated conditional (network_provider == "nm" or network_state != {}): True 11044 1726853249.77842: variable 'omit' from source: magic vars 11044 1726853249.77921: variable 'omit' from source: magic vars 11044 1726853249.78203: variable 'network_service_name' from source: role '' defaults 11044 1726853249.78206: variable 'network_service_name' from source: role '' defaults 11044 1726853249.78318: variable '__network_provider_setup' from source: role '' defaults 11044 1726853249.78329: variable '__network_service_name_default_nm' from source: role '' defaults 11044 1726853249.78392: variable '__network_service_name_default_nm' from source: role '' defaults 11044 1726853249.78407: variable '__network_packages_default_nm' from source: role '' defaults 11044 1726853249.78530: variable '__network_packages_default_nm' from source: role '' defaults 11044 1726853249.79005: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due 
to reserved name 11044 1726853249.82996: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 11044 1726853249.83139: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 11044 1726853249.83148: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 11044 1726853249.83193: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 11044 1726853249.83224: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 11044 1726853249.83320: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11044 1726853249.83467: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11044 1726853249.83473: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11044 1726853249.83476: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11044 1726853249.83478: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11044 1726853249.83595: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 
(found_in_cache=True, class_only=False) 11044 1726853249.83685: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11044 1726853249.83802: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11044 1726853249.83808: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11044 1726853249.83811: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11044 1726853249.84215: variable '__network_packages_default_gobject_packages' from source: role '' defaults 11044 1726853249.84350: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11044 1726853249.84437: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11044 1726853249.84441: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11044 1726853249.84525: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11044 1726853249.84546: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11044 1726853249.84655: variable 'ansible_python' from source: facts 11044 1726853249.84699: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 11044 1726853249.84798: variable '__network_wpa_supplicant_required' from source: role '' defaults 11044 1726853249.84874: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 11044 1726853249.85076: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11044 1726853249.85080: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11044 1726853249.85082: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11044 1726853249.85144: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11044 1726853249.85194: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11044 1726853249.85262: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11044 1726853249.85326: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11044 1726853249.85366: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11044 1726853249.85408: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11044 1726853249.85427: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11044 1726853249.85590: variable 'network_connections' from source: task vars 11044 1726853249.85677: variable 'controller_profile' from source: play vars 11044 1726853249.85689: variable 'controller_profile' from source: play vars 11044 1726853249.85733: variable 'controller_device' from source: play vars 11044 1726853249.85911: variable 'controller_device' from source: play vars 11044 1726853249.85914: variable 'port1_profile' from source: play vars 11044 1726853249.85992: variable 'port1_profile' from source: play vars 11044 1726853249.86019: variable 'dhcp_interface1' from source: play vars 11044 1726853249.86203: variable 'dhcp_interface1' from source: play vars 11044 1726853249.86256: variable 'controller_profile' from source: play vars 11044 1726853249.86357: variable 'controller_profile' from source: play vars 11044 1726853249.86364: variable 'port2_profile' from source: play vars 11044 1726853249.86436: variable 'port2_profile' from source: play vars 11044 1726853249.86458: variable 'dhcp_interface2' from source: play vars 11044 1726853249.86592: variable 'dhcp_interface2' from source: play vars 11044 
1726853249.86704: variable 'controller_profile' from source: play vars 11044 1726853249.86707: variable 'controller_profile' from source: play vars 11044 1726853249.86890: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 11044 1726853249.87402: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 11044 1726853249.87526: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 11044 1726853249.87654: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 11044 1726853249.87896: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 11044 1726853249.87900: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 11044 1726853249.87902: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 11044 1726853249.88040: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 11044 1726853249.88109: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 11044 1726853249.88375: variable '__network_wireless_connections_defined' from source: role '' defaults 11044 1726853249.88732: variable 'network_connections' from source: task vars 11044 1726853249.88745: variable 'controller_profile' from source: play vars 11044 1726853249.88881: variable 'controller_profile' from source: play vars 11044 
1726853249.88903: variable 'controller_device' from source: play vars 11044 1726853249.88986: variable 'controller_device' from source: play vars 11044 1726853249.89034: variable 'port1_profile' from source: play vars 11044 1726853249.89090: variable 'port1_profile' from source: play vars 11044 1726853249.89105: variable 'dhcp_interface1' from source: play vars 11044 1726853249.89450: variable 'dhcp_interface1' from source: play vars 11044 1726853249.89453: variable 'controller_profile' from source: play vars 11044 1726853249.89514: variable 'controller_profile' from source: play vars 11044 1726853249.89531: variable 'port2_profile' from source: play vars 11044 1726853249.89738: variable 'port2_profile' from source: play vars 11044 1726853249.89799: variable 'dhcp_interface2' from source: play vars 11044 1726853249.90187: variable 'dhcp_interface2' from source: play vars 11044 1726853249.90202: variable 'controller_profile' from source: play vars 11044 1726853249.90467: variable 'controller_profile' from source: play vars 11044 1726853249.90622: variable '__network_packages_default_wireless' from source: role '' defaults 11044 1726853249.90768: variable '__network_wireless_connections_defined' from source: role '' defaults 11044 1726853249.91151: variable 'network_connections' from source: task vars 11044 1726853249.91237: variable 'controller_profile' from source: play vars 11044 1726853249.91509: variable 'controller_profile' from source: play vars 11044 1726853249.91513: variable 'controller_device' from source: play vars 11044 1726853249.91623: variable 'controller_device' from source: play vars 11044 1726853249.91627: variable 'port1_profile' from source: play vars 11044 1726853249.92260: variable 'port1_profile' from source: play vars 11044 1726853249.92264: variable 'dhcp_interface1' from source: play vars 11044 1726853249.92266: variable 'dhcp_interface1' from source: play vars 11044 1726853249.92268: variable 'controller_profile' from source: play vars 
11044 1726853249.92270: variable 'controller_profile' from source: play vars 11044 1726853249.92274: variable 'port2_profile' from source: play vars 11044 1726853249.92478: variable 'port2_profile' from source: play vars 11044 1726853249.92490: variable 'dhcp_interface2' from source: play vars 11044 1726853249.92573: variable 'dhcp_interface2' from source: play vars 11044 1726853249.92688: variable 'controller_profile' from source: play vars 11044 1726853249.92767: variable 'controller_profile' from source: play vars 11044 1726853249.92864: variable '__network_packages_default_team' from source: role '' defaults 11044 1726853249.92955: variable '__network_team_connections_defined' from source: role '' defaults 11044 1726853249.93302: variable 'network_connections' from source: task vars 11044 1726853249.93318: variable 'controller_profile' from source: play vars 11044 1726853249.93464: variable 'controller_profile' from source: play vars 11044 1726853249.93479: variable 'controller_device' from source: play vars 11044 1726853249.93604: variable 'controller_device' from source: play vars 11044 1726853249.93626: variable 'port1_profile' from source: play vars 11044 1726853249.93696: variable 'port1_profile' from source: play vars 11044 1726853249.93930: variable 'dhcp_interface1' from source: play vars 11044 1726853249.93987: variable 'dhcp_interface1' from source: play vars 11044 1726853249.94003: variable 'controller_profile' from source: play vars 11044 1726853249.94105: variable 'controller_profile' from source: play vars 11044 1726853249.94251: variable 'port2_profile' from source: play vars 11044 1726853249.94439: variable 'port2_profile' from source: play vars 11044 1726853249.94443: variable 'dhcp_interface2' from source: play vars 11044 1726853249.94558: variable 'dhcp_interface2' from source: play vars 11044 1726853249.94579: variable 'controller_profile' from source: play vars 11044 1726853249.94677: variable 'controller_profile' from source: play vars 
11044 1726853249.94974: variable '__network_service_name_default_initscripts' from source: role '' defaults 11044 1726853249.95077: variable '__network_service_name_default_initscripts' from source: role '' defaults 11044 1726853249.95080: variable '__network_packages_default_initscripts' from source: role '' defaults 11044 1726853249.95096: variable '__network_packages_default_initscripts' from source: role '' defaults 11044 1726853249.95348: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 11044 1726853249.95933: variable 'network_connections' from source: task vars 11044 1726853249.95942: variable 'controller_profile' from source: play vars 11044 1726853249.96042: variable 'controller_profile' from source: play vars 11044 1726853249.96054: variable 'controller_device' from source: play vars 11044 1726853249.96225: variable 'controller_device' from source: play vars 11044 1726853249.96228: variable 'port1_profile' from source: play vars 11044 1726853249.96505: variable 'port1_profile' from source: play vars 11044 1726853249.96508: variable 'dhcp_interface1' from source: play vars 11044 1726853249.96510: variable 'dhcp_interface1' from source: play vars 11044 1726853249.96512: variable 'controller_profile' from source: play vars 11044 1726853249.96601: variable 'controller_profile' from source: play vars 11044 1726853249.96618: variable 'port2_profile' from source: play vars 11044 1726853249.96682: variable 'port2_profile' from source: play vars 11044 1726853249.96693: variable 'dhcp_interface2' from source: play vars 11044 1726853249.96753: variable 'dhcp_interface2' from source: play vars 11044 1726853249.96763: variable 'controller_profile' from source: play vars 11044 1726853249.96817: variable 'controller_profile' from source: play vars 11044 1726853249.96834: variable 'ansible_distribution' from source: facts 11044 1726853249.96841: variable '__network_rh_distros' from source: role '' defaults 11044 1726853249.96852: 
variable 'ansible_distribution_major_version' from source: facts 11044 1726853249.96886: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 11044 1726853249.97090: variable 'ansible_distribution' from source: facts 11044 1726853249.97100: variable '__network_rh_distros' from source: role '' defaults 11044 1726853249.97110: variable 'ansible_distribution_major_version' from source: facts 11044 1726853249.97128: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 11044 1726853249.97314: variable 'ansible_distribution' from source: facts 11044 1726853249.97374: variable '__network_rh_distros' from source: role '' defaults 11044 1726853249.97377: variable 'ansible_distribution_major_version' from source: facts 11044 1726853249.97379: variable 'network_provider' from source: set_fact 11044 1726853249.97414: variable 'omit' from source: magic vars 11044 1726853249.97446: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11044 1726853249.97533: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11044 1726853249.97558: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11044 1726853249.97586: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11044 1726853249.97602: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11044 1726853249.97640: variable 'inventory_hostname' from source: host vars for 'managed_node1' 11044 1726853249.97694: variable 'ansible_host' from source: host vars for 'managed_node1' 11044 1726853249.97697: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11044 1726853249.97766: Set connection var ansible_timeout to 10 11044 1726853249.97783: 
Set connection var ansible_shell_executable to /bin/sh 11044 1726853249.97790: Set connection var ansible_shell_type to sh 11044 1726853249.97804: Set connection var ansible_module_compression to ZIP_DEFLATED 11044 1726853249.97813: Set connection var ansible_connection to ssh 11044 1726853249.97824: Set connection var ansible_pipelining to False 11044 1726853249.97857: variable 'ansible_shell_executable' from source: unknown 11044 1726853249.97911: variable 'ansible_connection' from source: unknown 11044 1726853249.97914: variable 'ansible_module_compression' from source: unknown 11044 1726853249.97916: variable 'ansible_shell_type' from source: unknown 11044 1726853249.97918: variable 'ansible_shell_executable' from source: unknown 11044 1726853249.97920: variable 'ansible_host' from source: host vars for 'managed_node1' 11044 1726853249.97922: variable 'ansible_pipelining' from source: unknown 11044 1726853249.97924: variable 'ansible_timeout' from source: unknown 11044 1726853249.97926: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11044 1726853249.98021: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11044 1726853249.98037: variable 'omit' from source: magic vars 11044 1726853249.98052: starting attempt loop 11044 1726853249.98060: running the handler 11044 1726853249.98162: variable 'ansible_facts' from source: unknown 11044 1726853249.99304: _low_level_execute_command(): starting 11044 1726853249.99412: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11044 1726853250.00897: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11044 1726853250.01097: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK <<< 11044 1726853250.01121: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11044 1726853250.01205: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11044 1726853250.02878: stdout chunk (state=3): >>>/root <<< 11044 1726853250.03045: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11044 1726853250.03048: stdout chunk (state=3): >>><<< 11044 1726853250.03057: stderr chunk (state=3): >>><<< 11044 1726853250.03077: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 
10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11044 1726853250.03278: _low_level_execute_command(): starting 11044 1726853250.03282: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853250.0309026-11801-177844174627657 `" && echo ansible-tmp-1726853250.0309026-11801-177844174627657="` echo /root/.ansible/tmp/ansible-tmp-1726853250.0309026-11801-177844174627657 `" ) && sleep 0' 11044 1726853250.03938: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 <<< 11044 1726853250.03950: stderr chunk (state=3): >>>debug2: match not found <<< 11044 1726853250.03985: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found <<< 11044 1726853250.03996: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11044 1726853250.04072: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 11044 1726853250.04107: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11044 1726853250.04176: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11044 1726853250.06047: stdout chunk (state=3): >>>ansible-tmp-1726853250.0309026-11801-177844174627657=/root/.ansible/tmp/ansible-tmp-1726853250.0309026-11801-177844174627657 <<< 11044 1726853250.06206: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11044 1726853250.06209: stdout chunk (state=3): >>><<< 11044 1726853250.06212: stderr chunk (state=3): >>><<< 11044 1726853250.06288: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853250.0309026-11801-177844174627657=/root/.ansible/tmp/ansible-tmp-1726853250.0309026-11801-177844174627657 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking 
match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11044 1726853250.06291: variable 'ansible_module_compression' from source: unknown 11044 1726853250.06335: ANSIBALLZ: Using generic lock for ansible.legacy.systemd 11044 1726853250.06680: ANSIBALLZ: Acquiring lock 11044 1726853250.06684: ANSIBALLZ: Lock acquired: 140360202229168 11044 1726853250.06687: ANSIBALLZ: Creating module 11044 1726853250.58318: ANSIBALLZ: Writing module into payload 11044 1726853250.58447: ANSIBALLZ: Writing module 11044 1726853250.58469: ANSIBALLZ: Renaming module 11044 1726853250.58486: ANSIBALLZ: Done creating module 11044 1726853250.58511: variable 'ansible_facts' from source: unknown 11044 1726853250.58658: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853250.0309026-11801-177844174627657/AnsiballZ_systemd.py 11044 1726853250.58780: Sending initial data 11044 1726853250.58784: Sent initial data (156 bytes) 11044 1726853250.59248: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11044 1726853250.59252: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11044 1726853250.59257: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing 
configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11044 1726853250.59260: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11044 1726853250.59355: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11044 1726853250.59402: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11044 1726853250.61108: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 11044 1726853250.61143: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 11044 1726853250.61192: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-1104467doc9gy/tmpluq7erga /root/.ansible/tmp/ansible-tmp-1726853250.0309026-11801-177844174627657/AnsiballZ_systemd.py <<< 11044 1726853250.61199: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853250.0309026-11801-177844174627657/AnsiballZ_systemd.py" <<< 11044 1726853250.61236: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-1104467doc9gy/tmpluq7erga" to remote "/root/.ansible/tmp/ansible-tmp-1726853250.0309026-11801-177844174627657/AnsiballZ_systemd.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853250.0309026-11801-177844174627657/AnsiballZ_systemd.py" <<< 11044 1726853250.62874: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11044 1726853250.63000: stderr chunk (state=3): >>><<< 11044 1726853250.63003: stdout chunk (state=3): >>><<< 11044 1726853250.63005: done transferring module to remote 11044 1726853250.63008: _low_level_execute_command(): starting 11044 1726853250.63014: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853250.0309026-11801-177844174627657/ /root/.ansible/tmp/ansible-tmp-1726853250.0309026-11801-177844174627657/AnsiballZ_systemd.py && sleep 0' 11044 1726853250.63863: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: 
re-parsing configuration <<< 11044 1726853250.63866: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11044 1726853250.63923: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK <<< 11044 1726853250.63946: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11044 1726853250.64002: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11044 1726853250.65786: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11044 1726853250.65798: stderr chunk (state=3): >>><<< 11044 1726853250.65826: stdout chunk (state=3): >>><<< 11044 1726853250.65833: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11044 1726853250.65836: _low_level_execute_command(): starting 11044 1726853250.65838: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853250.0309026-11801-177844174627657/AnsiballZ_systemd.py && sleep 0' 11044 1726853250.66601: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11044 1726853250.66604: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11044 1726853250.66607: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11044 1726853250.66609: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11044 1726853250.66611: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 <<< 11044 1726853250.66625: stderr chunk (state=3): >>>debug2: match not found <<< 11044 1726853250.66627: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11044 1726853250.66810: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 11044 1726853250.66813: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.45.153 is address <<< 11044 1726853250.66815: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 11044 1726853250.66817: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11044 1726853250.66819: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11044 1726853250.66821: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf <<< 11044 1726853250.66825: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 <<< 11044 1726853250.66827: stderr chunk (state=3): >>>debug2: match found <<< 11044 1726853250.66828: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11044 1726853250.66830: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 11044 1726853250.66832: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11044 1726853250.66833: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11044 1726853250.66881: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11044 1726853250.96050: stdout chunk (state=3): >>> {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "702", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Fri 2024-09-20 13:20:47 EDT", "ExecMainStartTimestampMonotonic": 
"13747067", "ExecMainExitTimestampMonotonic": "0", "ExecMainHandoffTimestamp": "Fri 2024-09-20 13:20:47 EDT", "ExecMainHandoffTimestampMonotonic": "13825256", "ExecMainPID": "702", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "2977", "MemoryCurrent": "9977856", "MemoryPeak": "10502144", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3318546432", "EffectiveMemoryMax": "3702878208", "EffectiveMemoryHigh": "3702878208", "CPUUsageNSec": "391837000", "TasksCurrent": "4", "EffectiveTasksMax": "22362", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", 
"CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryZSwapWriteback": "yes", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22362", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "CoredumpRec<<< 11044 1726853250.96087: stdout chunk (state=3): >>>eive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13976", "LimitNPROCSoft": "13976", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13976", "LimitSIGPENDINGSoft": "13976", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", 
"Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": 
"root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "sysinit.target system.slice dbus.socket", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "NetworkManager-wait-online.service shutdown.target multi-user.target network.target cloud-init.service", "After": "cloud-in<<< 11044 1726853250.96092: stdout chunk (state=3): >>>it-local.service systemd-journald.socket sysinit.target dbus.socket dbus-broker.service basic.target system.slice network-pre.target", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 13:27:21 EDT", "StateChangeTimestampMonotonic": "407641563", 
"InactiveExitTimestamp": "Fri 2024-09-20 13:20:47 EDT", "InactiveExitTimestampMonotonic": "13748890", "ActiveEnterTimestamp": "Fri 2024-09-20 13:20:47 EDT", "ActiveEnterTimestampMonotonic": "14166608", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 13:20:47 EDT", "ConditionTimestampMonotonic": "13745559", "AssertTimestamp": "Fri 2024-09-20 13:20:47 EDT", "AssertTimestampMonotonic": "13745562", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "5f58decfa480494eac8aa3993b4c7ec8", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} <<< 11044 1726853250.97908: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.45.153 closed. 
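The large "status" payload above is the systemd module's dump of unit properties, essentially the KEY=VALUE output of `systemctl show NetworkManager.service` collected into a dict. A minimal sketch of that parsing step, assuming plain `KEY=VALUE` lines (illustrative only, not ansible-core's actual implementation):

```python
# Hedged sketch: turn `systemctl show <unit>` style KEY=VALUE lines into a
# dict shaped like the "status" payload in the module result above.
def parse_show_output(text: str) -> dict:
    props = {}
    for line in text.splitlines():
        key, sep, value = line.partition("=")
        if sep:  # skip lines without an '=' separator
            props[key] = value
    return props

sample = "Id=NetworkManager.service\nActiveState=active\nMainPID=702"
props = parse_show_output(sample)
print(props["ActiveState"])  # active
```

Note that values can themselves contain `=` (e.g. the ExecStart/ExecReload structures above), which is why only the first `=` is treated as the separator.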
<<< 11044 1726853250.97939: stderr chunk (state=3): >>><<< 11044 1726853250.97942: stdout chunk (state=3): >>><<< 11044 1726853250.97957: _low_level_execute_command() done: rc=0, stdout= {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "702", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Fri 2024-09-20 13:20:47 EDT", "ExecMainStartTimestampMonotonic": "13747067", "ExecMainExitTimestampMonotonic": "0", "ExecMainHandoffTimestamp": "Fri 2024-09-20 13:20:47 EDT", "ExecMainHandoffTimestampMonotonic": "13825256", "ExecMainPID": "702", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager 
/org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "2977", "MemoryCurrent": "9977856", "MemoryPeak": "10502144", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3318546432", "EffectiveMemoryMax": "3702878208", "EffectiveMemoryHigh": "3702878208", "CPUUsageNSec": "391837000", "TasksCurrent": "4", "EffectiveTasksMax": "22362", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryZSwapWriteback": "yes", "MemoryLimit": "infinity", "DevicePolicy": 
"auto", "TasksAccounting": "yes", "TasksMax": "22362", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "CoredumpReceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13976", "LimitNPROCSoft": "13976", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13976", "LimitSIGPENDINGSoft": "13976", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot 
cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", 
"Names": "NetworkManager.service", "Requires": "sysinit.target system.slice dbus.socket", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "NetworkManager-wait-online.service shutdown.target multi-user.target network.target cloud-init.service", "After": "cloud-init-local.service systemd-journald.socket sysinit.target dbus.socket dbus-broker.service basic.target system.slice network-pre.target", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 13:27:21 EDT", "StateChangeTimestampMonotonic": "407641563", "InactiveExitTimestamp": "Fri 2024-09-20 13:20:47 EDT", "InactiveExitTimestampMonotonic": "13748890", "ActiveEnterTimestamp": "Fri 2024-09-20 13:20:47 EDT", "ActiveEnterTimestampMonotonic": "14166608", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 13:20:47 EDT", "ConditionTimestampMonotonic": "13745559", "AssertTimestamp": "Fri 2024-09-20 13:20:47 EDT", "AssertTimestampMonotonic": "13745562", 
"Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "5f58decfa480494eac8aa3993b4c7ec8", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.45.153 closed. 
11044 1726853250.98078: done with _execute_module (ansible.legacy.systemd, {'name': 'NetworkManager', 'state': 'started', 'enabled': True, '_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.systemd', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853250.0309026-11801-177844174627657/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11044 1726853250.98097: _low_level_execute_command(): starting 11044 1726853250.98100: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853250.0309026-11801-177844174627657/ > /dev/null 2>&1 && sleep 0' 11044 1726853250.98566: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11044 1726853250.98569: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found <<< 11044 1726853250.98574: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 11044 1726853250.98577: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 11044 1726853250.98580: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11044 1726853250.98632: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 11044 1726853250.98637: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11044 1726853250.98639: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11044 1726853250.98679: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11044 1726853251.00482: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11044 1726853251.00509: stderr chunk (state=3): >>><<< 11044 1726853251.00512: stdout chunk (state=3): >>><<< 11044 1726853251.00524: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: 
fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11044 1726853251.00530: handler run complete 11044 1726853251.00573: attempt loop complete, returning result 11044 1726853251.00577: _execute() done 11044 1726853251.00579: dumping result to json 11044 1726853251.00590: done dumping result, returning 11044 1726853251.00602: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager [02083763-bbaf-c5a6-f857-000000000032] 11044 1726853251.00605: sending task result for task 02083763-bbaf-c5a6-f857-000000000032 11044 1726853251.01496: done sending task result for task 02083763-bbaf-c5a6-f857-000000000032 11044 1726853251.01498: WORKER PROCESS EXITING ok: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 11044 1726853251.01549: no more pending results, returning what we have 11044 1726853251.01552: results queue empty 11044 1726853251.01553: checking for any_errors_fatal 11044 1726853251.01557: done checking for any_errors_fatal 11044 1726853251.01558: checking for max_fail_percentage 11044 1726853251.01559: done checking for max_fail_percentage 11044 1726853251.01560: checking to see if all hosts have failed and the running result is not ok 11044 1726853251.01561: done checking to see if all hosts have failed 11044 1726853251.01562: getting the remaining hosts for this loop 11044 1726853251.01563: done getting the remaining hosts for this loop 11044 1726853251.01566: getting the next task for host managed_node1 11044 1726853251.01574: done getting next task for host managed_node1 11044 1726853251.01577: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 11044 1726853251.01579: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, 
run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11044 1726853251.01589: getting variables 11044 1726853251.01590: in VariableManager get_vars() 11044 1726853251.01634: Calling all_inventory to load vars for managed_node1 11044 1726853251.01637: Calling groups_inventory to load vars for managed_node1 11044 1726853251.01639: Calling all_plugins_inventory to load vars for managed_node1 11044 1726853251.01648: Calling all_plugins_play to load vars for managed_node1 11044 1726853251.01650: Calling groups_plugins_inventory to load vars for managed_node1 11044 1726853251.01652: Calling groups_plugins_play to load vars for managed_node1 11044 1726853251.02435: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11044 1726853251.03302: done with get_vars() 11044 1726853251.03317: done getting variables 11044 1726853251.03363: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start wpa_supplicant] ***** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133 Friday 20 September 2024 13:27:31 -0400 (0:00:01.274) 0:00:15.409 ****** 11044 1726853251.03388: entering _queue_task() for managed_node1/service 11044 
1726853251.03628: worker is 1 (out of 1 available) 11044 1726853251.03639: exiting _queue_task() for managed_node1/service 11044 1726853251.03651: done queuing things up, now waiting for results queue to drain 11044 1726853251.03652: waiting for pending results... 11044 1726853251.03825: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 11044 1726853251.03907: in run() - task 02083763-bbaf-c5a6-f857-000000000033 11044 1726853251.03919: variable 'ansible_search_path' from source: unknown 11044 1726853251.03922: variable 'ansible_search_path' from source: unknown 11044 1726853251.03954: calling self._execute() 11044 1726853251.04025: variable 'ansible_host' from source: host vars for 'managed_node1' 11044 1726853251.04030: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11044 1726853251.04038: variable 'omit' from source: magic vars 11044 1726853251.04324: variable 'ansible_distribution_major_version' from source: facts 11044 1726853251.04336: Evaluated conditional (ansible_distribution_major_version != '6'): True 11044 1726853251.04412: variable 'network_provider' from source: set_fact 11044 1726853251.04417: Evaluated conditional (network_provider == "nm"): True 11044 1726853251.04483: variable '__network_wpa_supplicant_required' from source: role '' defaults 11044 1726853251.04552: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 11044 1726853251.04676: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 11044 1726853251.06280: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 11044 1726853251.06325: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 11044 1726853251.06352: Loading FilterModule 'mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 11044 1726853251.06379: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 11044 1726853251.06401: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 11044 1726853251.06460: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11044 1726853251.06482: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11044 1726853251.06500: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11044 1726853251.06529: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11044 1726853251.06540: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11044 1726853251.06576: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11044 1726853251.06593: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11044 1726853251.06609: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11044 1726853251.06638: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11044 1726853251.06651: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11044 1726853251.06680: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11044 1726853251.06697: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11044 1726853251.06714: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11044 1726853251.06741: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11044 1726853251.06754: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11044 1726853251.06848: variable 'network_connections' from source: task vars 11044 1726853251.06856: variable 'controller_profile' from source: play vars 11044 1726853251.06902: variable 'controller_profile' from source: play vars 11044 
1726853251.06911: variable 'controller_device' from source: play vars 11044 1726853251.06955: variable 'controller_device' from source: play vars 11044 1726853251.06966: variable 'port1_profile' from source: play vars 11044 1726853251.07018: variable 'port1_profile' from source: play vars 11044 1726853251.07026: variable 'dhcp_interface1' from source: play vars 11044 1726853251.07074: variable 'dhcp_interface1' from source: play vars 11044 1726853251.07080: variable 'controller_profile' from source: play vars 11044 1726853251.07120: variable 'controller_profile' from source: play vars 11044 1726853251.07126: variable 'port2_profile' from source: play vars 11044 1726853251.07176: variable 'port2_profile' from source: play vars 11044 1726853251.07180: variable 'dhcp_interface2' from source: play vars 11044 1726853251.07217: variable 'dhcp_interface2' from source: play vars 11044 1726853251.07223: variable 'controller_profile' from source: play vars 11044 1726853251.07265: variable 'controller_profile' from source: play vars 11044 1726853251.07317: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 11044 1726853251.07426: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 11044 1726853251.07452: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 11044 1726853251.07475: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 11044 1726853251.07501: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 11044 1726853251.07528: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 11044 1726853251.07542: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 11044 1726853251.07560: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 11044 1726853251.07584: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 11044 1726853251.07623: variable '__network_wireless_connections_defined' from source: role '' defaults 11044 1726853251.07773: variable 'network_connections' from source: task vars 11044 1726853251.07777: variable 'controller_profile' from source: play vars 11044 1726853251.07826: variable 'controller_profile' from source: play vars 11044 1726853251.07829: variable 'controller_device' from source: play vars 11044 1726853251.07869: variable 'controller_device' from source: play vars 11044 1726853251.07878: variable 'port1_profile' from source: play vars 11044 1726853251.07919: variable 'port1_profile' from source: play vars 11044 1726853251.07926: variable 'dhcp_interface1' from source: play vars 11044 1726853251.07969: variable 'dhcp_interface1' from source: play vars 11044 1726853251.07976: variable 'controller_profile' from source: play vars 11044 1726853251.08017: variable 'controller_profile' from source: play vars 11044 1726853251.08023: variable 'port2_profile' from source: play vars 11044 1726853251.08076: variable 'port2_profile' from source: play vars 11044 1726853251.08082: variable 'dhcp_interface2' from source: play vars 11044 1726853251.08125: variable 'dhcp_interface2' from source: play vars 11044 1726853251.08131: variable 'controller_profile' from source: play vars 11044 1726853251.08175: variable 'controller_profile' from source: play vars 11044 1726853251.08204: Evaluated conditional 
(__network_wpa_supplicant_required): False 11044 1726853251.08207: when evaluation is False, skipping this task 11044 1726853251.08210: _execute() done 11044 1726853251.08212: dumping result to json 11044 1726853251.08214: done dumping result, returning 11044 1726853251.08224: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant [02083763-bbaf-c5a6-f857-000000000033] 11044 1726853251.08227: sending task result for task 02083763-bbaf-c5a6-f857-000000000033 11044 1726853251.08317: done sending task result for task 02083763-bbaf-c5a6-f857-000000000033 11044 1726853251.08320: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "__network_wpa_supplicant_required", "skip_reason": "Conditional result was False" } 11044 1726853251.08387: no more pending results, returning what we have 11044 1726853251.08390: results queue empty 11044 1726853251.08390: checking for any_errors_fatal 11044 1726853251.08408: done checking for any_errors_fatal 11044 1726853251.08409: checking for max_fail_percentage 11044 1726853251.08410: done checking for max_fail_percentage 11044 1726853251.08411: checking to see if all hosts have failed and the running result is not ok 11044 1726853251.08412: done checking to see if all hosts have failed 11044 1726853251.08413: getting the remaining hosts for this loop 11044 1726853251.08414: done getting the remaining hosts for this loop 11044 1726853251.08417: getting the next task for host managed_node1 11044 1726853251.08422: done getting next task for host managed_node1 11044 1726853251.08425: ^ task is: TASK: fedora.linux_system_roles.network : Enable network service 11044 1726853251.08428: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
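The skip above comes from a `when:` guard on the role task: the conditional `__network_wpa_supplicant_required` evaluated to `False`, so `TaskExecutor` short-circuited before any action plugin ran and emitted the `skip_reason` result shown. A minimal sketch of what such a guarded task looks like — the log only confirms the task name and the conditional, so the module and its arguments here are assumptions, not the role's literal source:

```yaml
# Hedged sketch, not the actual fedora.linux_system_roles.network task body.
# Only the task name and the `when:` expression are confirmed by the log.
- name: Enable and start wpa_supplicant
  ansible.builtin.service:
    name: wpa_supplicant   # assumed service name
    enabled: true
    state: started
  when: __network_wpa_supplicant_required
```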
(HOST STATE: block=0, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11044 1726853251.08441: getting variables 11044 1726853251.08442: in VariableManager get_vars() 11044 1726853251.08481: Calling all_inventory to load vars for managed_node1 11044 1726853251.08483: Calling groups_inventory to load vars for managed_node1 11044 1726853251.08485: Calling all_plugins_inventory to load vars for managed_node1 11044 1726853251.08502: Calling all_plugins_play to load vars for managed_node1 11044 1726853251.08504: Calling groups_plugins_inventory to load vars for managed_node1 11044 1726853251.08507: Calling groups_plugins_play to load vars for managed_node1 11044 1726853251.09333: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11044 1726853251.10196: done with get_vars() 11044 1726853251.10210: done getting variables 11044 1726853251.10254: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable network service] ************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142 Friday 20 September 2024 13:27:31 -0400 (0:00:00.068) 0:00:15.478 ****** 11044 1726853251.10277: entering _queue_task() for managed_node1/service 11044 1726853251.10499: worker is 1 (out of 1 available) 11044 1726853251.10512: exiting _queue_task() for managed_node1/service 
11044 1726853251.10524: done queuing things up, now waiting for results queue to drain 11044 1726853251.10526: waiting for pending results... 11044 1726853251.10699: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable network service 11044 1726853251.10786: in run() - task 02083763-bbaf-c5a6-f857-000000000034 11044 1726853251.10797: variable 'ansible_search_path' from source: unknown 11044 1726853251.10801: variable 'ansible_search_path' from source: unknown 11044 1726853251.10828: calling self._execute() 11044 1726853251.10903: variable 'ansible_host' from source: host vars for 'managed_node1' 11044 1726853251.10908: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11044 1726853251.10918: variable 'omit' from source: magic vars 11044 1726853251.11200: variable 'ansible_distribution_major_version' from source: facts 11044 1726853251.11212: Evaluated conditional (ansible_distribution_major_version != '6'): True 11044 1726853251.11290: variable 'network_provider' from source: set_fact 11044 1726853251.11294: Evaluated conditional (network_provider == "initscripts"): False 11044 1726853251.11297: when evaluation is False, skipping this task 11044 1726853251.11300: _execute() done 11044 1726853251.11302: dumping result to json 11044 1726853251.11304: done dumping result, returning 11044 1726853251.11313: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable network service [02083763-bbaf-c5a6-f857-000000000034] 11044 1726853251.11316: sending task result for task 02083763-bbaf-c5a6-f857-000000000034 11044 1726853251.11400: done sending task result for task 02083763-bbaf-c5a6-f857-000000000034 11044 1726853251.11402: WORKER PROCESS EXITING skipping: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 11044 1726853251.11457: no more pending results, returning what we have 
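The "censored" result above is what Ansible substitutes whenever a task carries `no_log: true` — the placeholder is emitted even for a skipped task, as here. A sketch under that assumption (the real task arguments are precisely what `no_log` hides, so everything below the task name is hypothetical):

```yaml
# Sketch only: the log confirms the task name, the `service` action plugin,
# the initscripts conditional, and that no_log was set; arguments are assumed.
- name: Enable network service
  ansible.builtin.service:
    name: network            # assumed; hidden by no_log in the real run
    enabled: true
  no_log: true
  when: network_provider == "initscripts"
```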
11044 1726853251.11460: results queue empty 11044 1726853251.11461: checking for any_errors_fatal 11044 1726853251.11469: done checking for any_errors_fatal 11044 1726853251.11470: checking for max_fail_percentage 11044 1726853251.11473: done checking for max_fail_percentage 11044 1726853251.11474: checking to see if all hosts have failed and the running result is not ok 11044 1726853251.11475: done checking to see if all hosts have failed 11044 1726853251.11476: getting the remaining hosts for this loop 11044 1726853251.11477: done getting the remaining hosts for this loop 11044 1726853251.11480: getting the next task for host managed_node1 11044 1726853251.11485: done getting next task for host managed_node1 11044 1726853251.11488: ^ task is: TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 11044 1726853251.11491: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11044 1726853251.11505: getting variables 11044 1726853251.11506: in VariableManager get_vars() 11044 1726853251.11539: Calling all_inventory to load vars for managed_node1 11044 1726853251.11542: Calling groups_inventory to load vars for managed_node1 11044 1726853251.11544: Calling all_plugins_inventory to load vars for managed_node1 11044 1726853251.11552: Calling all_plugins_play to load vars for managed_node1 11044 1726853251.11554: Calling groups_plugins_inventory to load vars for managed_node1 11044 1726853251.11556: Calling groups_plugins_play to load vars for managed_node1 11044 1726853251.12318: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11044 1726853251.13277: done with get_vars() 11044 1726853251.13291: done getting variables 11044 1726853251.13335: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ensure initscripts network file dependency is present] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150 Friday 20 September 2024 13:27:31 -0400 (0:00:00.030) 0:00:15.509 ****** 11044 1726853251.13359: entering _queue_task() for managed_node1/copy 11044 1726853251.13597: worker is 1 (out of 1 available) 11044 1726853251.13611: exiting _queue_task() for managed_node1/copy 11044 1726853251.13623: done queuing things up, now waiting for results queue to drain 11044 1726853251.13625: waiting for pending results... 
11044 1726853251.13798: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 11044 1726853251.13880: in run() - task 02083763-bbaf-c5a6-f857-000000000035 11044 1726853251.13890: variable 'ansible_search_path' from source: unknown 11044 1726853251.13894: variable 'ansible_search_path' from source: unknown 11044 1726853251.13921: calling self._execute() 11044 1726853251.13992: variable 'ansible_host' from source: host vars for 'managed_node1' 11044 1726853251.13996: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11044 1726853251.14005: variable 'omit' from source: magic vars 11044 1726853251.14284: variable 'ansible_distribution_major_version' from source: facts 11044 1726853251.14299: Evaluated conditional (ansible_distribution_major_version != '6'): True 11044 1726853251.14374: variable 'network_provider' from source: set_fact 11044 1726853251.14378: Evaluated conditional (network_provider == "initscripts"): False 11044 1726853251.14381: when evaluation is False, skipping this task 11044 1726853251.14384: _execute() done 11044 1726853251.14386: dumping result to json 11044 1726853251.14391: done dumping result, returning 11044 1726853251.14410: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present [02083763-bbaf-c5a6-f857-000000000035] 11044 1726853251.14413: sending task result for task 02083763-bbaf-c5a6-f857-000000000035 11044 1726853251.14491: done sending task result for task 02083763-bbaf-c5a6-f857-000000000035 11044 1726853251.14494: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "network_provider == \"initscripts\"", "skip_reason": "Conditional result was False" } 11044 1726853251.14554: no more pending results, returning what we have 11044 1726853251.14557: results queue empty 11044 1726853251.14558: checking for 
any_errors_fatal 11044 1726853251.14563: done checking for any_errors_fatal 11044 1726853251.14564: checking for max_fail_percentage 11044 1726853251.14565: done checking for max_fail_percentage 11044 1726853251.14566: checking to see if all hosts have failed and the running result is not ok 11044 1726853251.14567: done checking to see if all hosts have failed 11044 1726853251.14568: getting the remaining hosts for this loop 11044 1726853251.14569: done getting the remaining hosts for this loop 11044 1726853251.14574: getting the next task for host managed_node1 11044 1726853251.14580: done getting next task for host managed_node1 11044 1726853251.14583: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking connection profiles 11044 1726853251.14586: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11044 1726853251.14599: getting variables 11044 1726853251.14600: in VariableManager get_vars() 11044 1726853251.14635: Calling all_inventory to load vars for managed_node1 11044 1726853251.14637: Calling groups_inventory to load vars for managed_node1 11044 1726853251.14639: Calling all_plugins_inventory to load vars for managed_node1 11044 1726853251.14650: Calling all_plugins_play to load vars for managed_node1 11044 1726853251.14652: Calling groups_plugins_inventory to load vars for managed_node1 11044 1726853251.14655: Calling groups_plugins_play to load vars for managed_node1 11044 1726853251.15401: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11044 1726853251.16254: done with get_vars() 11044 1726853251.16269: done getting variables TASK [fedora.linux_system_roles.network : Configure networking connection profiles] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159 Friday 20 September 2024 13:27:31 -0400 (0:00:00.029) 0:00:15.539 ****** 11044 1726853251.16333: entering _queue_task() for managed_node1/fedora.linux_system_roles.network_connections 11044 1726853251.16334: Creating lock for fedora.linux_system_roles.network_connections 11044 1726853251.16600: worker is 1 (out of 1 available) 11044 1726853251.16620: exiting _queue_task() for managed_node1/fedora.linux_system_roles.network_connections 11044 1726853251.16635: done queuing things up, now waiting for results queue to drain 11044 1726853251.16636: waiting for pending results... 
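The repeated variable resolutions in this trace (`controller_profile`, `controller_device`, `port1_profile`/`port2_profile`, `dhcp_interface1`/`dhcp_interface2`) suggest a bond controller profile with two port profiles. A hypothetical shape of the `network_connections` task var consistent with those names — the actual values are never printed in this log, so the profile types and structure are assumptions:

```yaml
# Hypothetical reconstruction from the variable names in the trace;
# the `type:` and `controller:` fields are assumptions, not log contents.
network_connections:
  - name: "{{ controller_profile }}"
    interface_name: "{{ controller_device }}"
    type: bond
  - name: "{{ port1_profile }}"
    interface_name: "{{ dhcp_interface1 }}"
    controller: "{{ controller_profile }}"
  - name: "{{ port2_profile }}"
    interface_name: "{{ dhcp_interface2 }}"
    controller: "{{ controller_profile }}"
```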
11044 1726853251.16866: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Configure networking connection profiles 11044 1726853251.16955: in run() - task 02083763-bbaf-c5a6-f857-000000000036 11044 1726853251.16967: variable 'ansible_search_path' from source: unknown 11044 1726853251.16974: variable 'ansible_search_path' from source: unknown 11044 1726853251.17004: calling self._execute() 11044 1726853251.17073: variable 'ansible_host' from source: host vars for 'managed_node1' 11044 1726853251.17077: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11044 1726853251.17090: variable 'omit' from source: magic vars 11044 1726853251.17491: variable 'ansible_distribution_major_version' from source: facts 11044 1726853251.17513: Evaluated conditional (ansible_distribution_major_version != '6'): True 11044 1726853251.17527: variable 'omit' from source: magic vars 11044 1726853251.17595: variable 'omit' from source: magic vars 11044 1726853251.17713: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 11044 1726853251.19592: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 11044 1726853251.19634: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 11044 1726853251.19664: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 11044 1726853251.19692: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 11044 1726853251.19741: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 11044 1726853251.19876: variable 'network_provider' from source: set_fact 11044 1726853251.19914: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11044 1726853251.20257: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11044 1726853251.20292: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11044 1726853251.20336: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11044 1726853251.20354: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11044 1726853251.20430: variable 'omit' from source: magic vars 11044 1726853251.20540: variable 'omit' from source: magic vars 11044 1726853251.20644: variable 'network_connections' from source: task vars 11044 1726853251.20660: variable 'controller_profile' from source: play vars 11044 1726853251.20723: variable 'controller_profile' from source: play vars 11044 1726853251.20735: variable 'controller_device' from source: play vars 11044 1726853251.20794: variable 'controller_device' from source: play vars 11044 1726853251.20808: variable 'port1_profile' from source: play vars 11044 1726853251.20866: variable 'port1_profile' from source: play vars 11044 1726853251.20880: variable 'dhcp_interface1' from source: play vars 11044 1726853251.20936: variable 'dhcp_interface1' from source: play vars 11044 1726853251.20977: variable 'controller_profile' from source: play vars 11044 1726853251.21009: variable 'controller_profile' from source: play vars 11044 1726853251.21276: 
variable 'port2_profile' from source: play vars 11044 1726853251.21279: variable 'port2_profile' from source: play vars 11044 1726853251.21281: variable 'dhcp_interface2' from source: play vars 11044 1726853251.21312: variable 'dhcp_interface2' from source: play vars 11044 1726853251.21322: variable 'controller_profile' from source: play vars 11044 1726853251.21448: variable 'controller_profile' from source: play vars 11044 1726853251.21647: variable 'omit' from source: magic vars 11044 1726853251.21669: variable '__lsr_ansible_managed' from source: task vars 11044 1726853251.21732: variable '__lsr_ansible_managed' from source: task vars 11044 1726853251.21960: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup 11044 1726853251.22114: Loaded config def from plugin (lookup/template) 11044 1726853251.22117: Loading LookupModule 'template' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/template.py 11044 1726853251.22137: File lookup term: get_ansible_managed.j2 11044 1726853251.22140: variable 'ansible_search_path' from source: unknown 11044 1726853251.22143: evaluation_path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks 11044 1726853251.22155: search_path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/templates/get_ansible_managed.j2 /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/get_ansible_managed.j2 /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/templates/get_ansible_managed.j2 /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/get_ansible_managed.j2 /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 
/tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 11044 1726853251.22168: variable 'ansible_search_path' from source: unknown 11044 1726853251.26237: variable 'ansible_managed' from source: unknown 11044 1726853251.26476: variable 'omit' from source: magic vars 11044 1726853251.26480: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11044 1726853251.26483: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11044 1726853251.26485: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11044 1726853251.26487: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11044 1726853251.26489: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11044 1726853251.26492: variable 'inventory_hostname' from source: host vars for 'managed_node1' 11044 1726853251.26495: variable 'ansible_host' from source: host vars for 'managed_node1' 11044 1726853251.26503: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11044 1726853251.26603: Set connection var ansible_timeout to 10 11044 1726853251.26618: Set connection var ansible_shell_executable to /bin/sh 11044 1726853251.26626: Set connection var ansible_shell_type to sh 11044 1726853251.26637: Set connection var ansible_module_compression to ZIP_DEFLATED 11044 1726853251.26655: Set connection var ansible_connection to ssh 11044 1726853251.26665: Set connection var ansible_pipelining to False 11044 1726853251.26695: 
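The `search_path:` list above is the template lookup's standard resolution order: the role's `templates/` directory, the role directory itself, the task-file-relative paths, then the playbook-relative paths. Given the "File lookup term: get_ansible_managed.j2" line, the role is resolving `__lsr_ansible_managed` through a template lookup roughly like the sketch below; the `set_fact` wrapper is an assumption, as the log shows only the lookup itself:

```yaml
# Sketch of the lookup pattern implied by the trace; the surrounding
# set_fact task is assumed, not taken from the role source.
- name: Resolve the ansible_managed header for generated profiles
  ansible.builtin.set_fact:
    __lsr_ansible_managed: "{{ lookup('template', 'get_ansible_managed.j2') }}"
```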
variable 'ansible_shell_executable' from source: unknown 11044 1726853251.26703: variable 'ansible_connection' from source: unknown 11044 1726853251.26710: variable 'ansible_module_compression' from source: unknown 11044 1726853251.26717: variable 'ansible_shell_type' from source: unknown 11044 1726853251.26724: variable 'ansible_shell_executable' from source: unknown 11044 1726853251.26732: variable 'ansible_host' from source: host vars for 'managed_node1' 11044 1726853251.26739: variable 'ansible_pipelining' from source: unknown 11044 1726853251.26745: variable 'ansible_timeout' from source: unknown 11044 1726853251.26753: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11044 1726853251.26876: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 11044 1726853251.26891: variable 'omit' from source: magic vars 11044 1726853251.26902: starting attempt loop 11044 1726853251.26908: running the handler 11044 1726853251.26925: _low_level_execute_command(): starting 11044 1726853251.26936: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11044 1726853251.27594: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11044 1726853251.27610: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11044 1726853251.27625: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11044 1726853251.27643: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11044 1726853251.27661: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 <<< 11044 1726853251.27677: stderr chunk (state=3): >>>debug2: match not found <<< 11044 1726853251.27692: stderr 
chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11044 1726853251.27711: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 11044 1726853251.27723: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.45.153 is address <<< 11044 1726853251.27790: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11044 1726853251.27814: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 11044 1726853251.27830: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11044 1726853251.27851: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11044 1726853251.28029: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11044 1726853251.29707: stdout chunk (state=3): >>>/root <<< 11044 1726853251.29903: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11044 1726853251.29906: stdout chunk (state=3): >>><<< 11044 1726853251.29909: stderr chunk (state=3): >>><<< 11044 1726853251.29931: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration 
requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11044 1726853251.29959: _low_level_execute_command(): starting 11044 1726853251.29975: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853251.2994292-11851-60569898939839 `" && echo ansible-tmp-1726853251.2994292-11851-60569898939839="` echo /root/.ansible/tmp/ansible-tmp-1726853251.2994292-11851-60569898939839 `" ) && sleep 0' 11044 1726853251.30689: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11044 1726853251.30703: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11044 1726853251.30788: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11044 1726853251.30842: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 11044 1726853251.30859: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11044 1726853251.30886: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11044 1726853251.30960: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11044 1726853251.32834: stdout chunk (state=3): >>>ansible-tmp-1726853251.2994292-11851-60569898939839=/root/.ansible/tmp/ansible-tmp-1726853251.2994292-11851-60569898939839 <<< 11044 1726853251.32942: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11044 1726853251.32993: stderr chunk (state=3): >>><<< 11044 1726853251.32997: stdout chunk (state=3): >>><<< 11044 1726853251.33014: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853251.2994292-11851-60569898939839=/root/.ansible/tmp/ansible-tmp-1726853251.2994292-11851-60569898939839 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11044 1726853251.33176: variable 'ansible_module_compression' from source: unknown 11044 1726853251.33179: ANSIBALLZ: Using lock for fedora.linux_system_roles.network_connections 11044 1726853251.33182: ANSIBALLZ: Acquiring lock 11044 1726853251.33184: ANSIBALLZ: Lock acquired: 140360199994064 11044 1726853251.33186: ANSIBALLZ: Creating module 11044 1726853251.49890: ANSIBALLZ: Writing module into payload 11044 1726853251.50114: ANSIBALLZ: Writing module 11044 1726853251.50132: ANSIBALLZ: Renaming module 11044 1726853251.50139: ANSIBALLZ: Done creating module 11044 1726853251.50161: variable 'ansible_facts' from source: unknown 11044 1726853251.50250: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853251.2994292-11851-60569898939839/AnsiballZ_network_connections.py 11044 1726853251.50347: Sending initial data 11044 1726853251.50350: Sent initial data (167 bytes) 11044 1726853251.50803: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11044 1726853251.50808: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found <<< 11044 1726853251.50810: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 11044 
1726853251.50812: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11044 1726853251.50815: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11044 1726853251.50862: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 11044 1726853251.50865: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11044 1726853251.50916: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11044 1726853251.52582: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 11044 1726853251.52636: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 11044 1726853251.52676: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-1104467doc9gy/tmphg8uzp44 /root/.ansible/tmp/ansible-tmp-1726853251.2994292-11851-60569898939839/AnsiballZ_network_connections.py <<< 11044 1726853251.52723: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853251.2994292-11851-60569898939839/AnsiballZ_network_connections.py" <<< 11044 1726853251.52763: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-1104467doc9gy/tmphg8uzp44" to remote "/root/.ansible/tmp/ansible-tmp-1726853251.2994292-11851-60569898939839/AnsiballZ_network_connections.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853251.2994292-11851-60569898939839/AnsiballZ_network_connections.py" <<< 11044 1726853251.53996: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11044 1726853251.54080: stderr chunk (state=3): >>><<< 11044 1726853251.54084: stdout chunk (state=3): >>><<< 11044 1726853251.54127: done transferring module to remote 11044 1726853251.54130: _low_level_execute_command(): starting 11044 1726853251.54152: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853251.2994292-11851-60569898939839/ /root/.ansible/tmp/ansible-tmp-1726853251.2994292-11851-60569898939839/AnsiballZ_network_connections.py && sleep 0' 11044 1726853251.54852: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 11044 
1726853251.54856: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11044 1726853251.54870: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK <<< 11044 1726853251.54875: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11044 1726853251.54959: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11044 1726853251.56796: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11044 1726853251.56800: stdout chunk (state=3): >>><<< 11044 1726853251.56806: stderr chunk (state=3): >>><<< 11044 1726853251.56836: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: 
match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11044 1726853251.56839: _low_level_execute_command(): starting 11044 1726853251.56846: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853251.2994292-11851-60569898939839/AnsiballZ_network_connections.py && sleep 0' 11044 1726853251.57687: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11044 1726853251.57750: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 11044 1726853251.57753: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11044 1726853251.57773: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11044 1726853251.57874: stderr chunk (state=3): >>>debug1: 
mux_client_request_session: master session id: 2 <<< 11044 1726853251.99769: stdout chunk (state=3): >>> {"changed": true, "warnings": [], "stderr": "[007] #0, state:up persistent_state:present, 'bond0': add connection bond0, d9b8d035-3bc8-441e-9301-200f331b189f\n[008] #1, state:up persistent_state:present, 'bond0.0': add connection bond0.0, 8b9437bb-6e81-41e5-9306-532bc96d8ae0\n[009] #2, state:up persistent_state:present, 'bond0.1': add connection bond0.1, 89e597e4-eed9-47da-b8eb-c0faec291bf7\n[010] #0, state:up persistent_state:present, 'bond0': up connection bond0, d9b8d035-3bc8-441e-9301-200f331b189f (is-modified)\n[011] #1, state:up persistent_state:present, 'bond0.0': up connection bond0.0, 8b9437bb-6e81-41e5-9306-532bc96d8ae0 (not-active)\n[012] #2, state:up persistent_state:present, 'bond0.1': up connection bond0.1, 89e597e4-eed9-47da-b8eb-c0faec291bf7 (not-active)\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "bond0", "state": "up", "type": "bond", "interface_name": "deprecated-bond", "bond": {"mode": "active-backup", "miimon": 110}, "ip": {"route_metric4": 65535}}, {"name": "bond0.0", "state": "up", "type": "ethernet", "interface_name": "test1", "master": "bond0"}, {"name": "bond0.1", "state": "up", "type": "ethernet", "interface_name": "test2", "master": "bond0"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "bond0", "state": "up", "type": "bond", "interface_name": "deprecated-bond", "bond": {"mode": "active-backup", "miimon": 110}, "ip": {"route_metric4": 65535}}, {"name": "bond0.0", "state": "up", "type": "ethernet", "interface_name": "test1", "master": "bond0"}, {"name": "bond0.1", "state": "up", "type": "ethernet", "interface_name": "test2", "master": "bond0"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, 
"force_state_change": false, "__debug_flags": ""}}}<<< 11044 1726853251.99885: stdout chunk (state=3): >>> <<< 11044 1726853252.01915: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.45.153 closed. <<< 11044 1726853252.01919: stdout chunk (state=3): >>><<< 11044 1726853252.01921: stderr chunk (state=3): >>><<< 11044 1726853252.02077: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "warnings": [], "stderr": "[007] #0, state:up persistent_state:present, 'bond0': add connection bond0, d9b8d035-3bc8-441e-9301-200f331b189f\n[008] #1, state:up persistent_state:present, 'bond0.0': add connection bond0.0, 8b9437bb-6e81-41e5-9306-532bc96d8ae0\n[009] #2, state:up persistent_state:present, 'bond0.1': add connection bond0.1, 89e597e4-eed9-47da-b8eb-c0faec291bf7\n[010] #0, state:up persistent_state:present, 'bond0': up connection bond0, d9b8d035-3bc8-441e-9301-200f331b189f (is-modified)\n[011] #1, state:up persistent_state:present, 'bond0.0': up connection bond0.0, 8b9437bb-6e81-41e5-9306-532bc96d8ae0 (not-active)\n[012] #2, state:up persistent_state:present, 'bond0.1': up connection bond0.1, 89e597e4-eed9-47da-b8eb-c0faec291bf7 (not-active)\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "bond0", "state": "up", "type": "bond", "interface_name": "deprecated-bond", "bond": {"mode": "active-backup", "miimon": 110}, "ip": {"route_metric4": 65535}}, {"name": "bond0.0", "state": "up", "type": "ethernet", "interface_name": "test1", "master": "bond0"}, {"name": "bond0.1", "state": "up", "type": "ethernet", "interface_name": "test2", "master": "bond0"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "bond0", "state": "up", "type": "bond", "interface_name": "deprecated-bond", "bond": {"mode": "active-backup", "miimon": 110}, 
"ip": {"route_metric4": 65535}}, {"name": "bond0.0", "state": "up", "type": "ethernet", "interface_name": "test1", "master": "bond0"}, {"name": "bond0.1", "state": "up", "type": "ethernet", "interface_name": "test2", "master": "bond0"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.45.153 closed. 
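The module result returned over the closed SSH connection above is plain JSON on stdout. As an illustrative sketch (not part of the role), the payload can be unpacked with a few lines of Python; the JSON literal below is a trimmed, hand-copied reproduction of the logged result, keeping only the fields inspected:

```python
import json

# Trimmed reproduction of the network_connections module result logged
# above (only the fields inspected below; not the full payload).
result_text = """
{
  "changed": true,
  "invocation": {
    "module_args": {
      "provider": "nm",
      "connections": [
        {"name": "bond0", "state": "up", "type": "bond",
         "interface_name": "deprecated-bond",
         "bond": {"mode": "active-backup", "miimon": 110},
         "ip": {"route_metric4": 65535}},
        {"name": "bond0.0", "state": "up", "type": "ethernet",
         "interface_name": "test1", "master": "bond0"},
        {"name": "bond0.1", "state": "up", "type": "ethernet",
         "interface_name": "test2", "master": "bond0"}
      ]
    }
  }
}
"""

result = json.loads(result_text)
connections = result["invocation"]["module_args"]["connections"]
# The two ethernet profiles are enslaved to bond0 via their "master" key.
ports = [c["name"] for c in connections if c.get("master") == "bond0"]
print(result["changed"], ports)
```

This mirrors what the action plugin does with the stdout chunk: parse it as JSON and merge it into the task result.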
11044 1726853252.02081: done with _execute_module (fedora.linux_system_roles.network_connections, {'provider': 'nm', 'connections': [{'name': 'bond0', 'state': 'up', 'type': 'bond', 'interface_name': 'deprecated-bond', 'bond': {'mode': 'active-backup', 'miimon': 110}, 'ip': {'route_metric4': 65535}}, {'name': 'bond0.0', 'state': 'up', 'type': 'ethernet', 'interface_name': 'test1', 'master': 'bond0'}, {'name': 'bond0.1', 'state': 'up', 'type': 'ethernet', 'interface_name': 'test2', 'master': 'bond0'}], '__header': '#\n# Ansible managed\n#\n# system_role:network\n', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'fedora.linux_system_roles.network_connections', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853251.2994292-11851-60569898939839/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11044 1726853252.02083: _low_level_execute_command(): starting 11044 1726853252.02085: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853251.2994292-11851-60569898939839/ > /dev/null 2>&1 && sleep 0' 11044 1726853252.02710: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11044 1726853252.02724: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11044 1726853252.02739: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11044 1726853252.02876: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final 
all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK <<< 11044 1726853252.02902: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11044 1726853252.02987: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11044 1726853252.04939: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11044 1726853252.04943: stdout chunk (state=3): >>><<< 11044 1726853252.04948: stderr chunk (state=3): >>><<< 11044 1726853252.05184: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11044 1726853252.05188: handler run complete 11044 1726853252.05190: attempt loop complete, returning result 11044 1726853252.05192: _execute() done 11044 1726853252.05195: dumping result to json 11044 1726853252.05197: done dumping result, returning 11044 1726853252.05199: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Configure networking connection profiles [02083763-bbaf-c5a6-f857-000000000036] 11044 1726853252.05201: sending task result for task 02083763-bbaf-c5a6-f857-000000000036 11044 1726853252.05287: done sending task result for task 02083763-bbaf-c5a6-f857-000000000036 11044 1726853252.05290: WORKER PROCESS EXITING
changed: [managed_node1] => {
    "_invocation": {
        "module_args": {
            "__debug_flags": "",
            "__header": "#\n# Ansible managed\n#\n# system_role:network\n",
            "connections": [
                {
                    "bond": {
                        "miimon": 110,
                        "mode": "active-backup"
                    },
                    "interface_name": "deprecated-bond",
                    "ip": {
                        "route_metric4": 65535
                    },
                    "name": "bond0",
                    "state": "up",
                    "type": "bond"
                },
                {
                    "interface_name": "test1",
                    "master": "bond0",
                    "name": "bond0.0",
                    "state": "up",
                    "type": "ethernet"
                },
                {
                    "interface_name": "test2",
                    "master": "bond0",
                    "name": "bond0.1",
                    "state": "up",
                    "type": "ethernet"
                }
            ],
            "force_state_change": false,
            "ignore_errors": false,
            "provider": "nm"
        }
    },
    "changed": true
}

STDERR:

[007] #0, state:up persistent_state:present, 'bond0': add connection bond0, d9b8d035-3bc8-441e-9301-200f331b189f
[008] #1, state:up persistent_state:present, 'bond0.0': add connection bond0.0, 8b9437bb-6e81-41e5-9306-532bc96d8ae0
[009] #2, state:up persistent_state:present, 'bond0.1': add connection bond0.1, 89e597e4-eed9-47da-b8eb-c0faec291bf7
[010] #0, state:up persistent_state:present, 'bond0': up connection bond0, d9b8d035-3bc8-441e-9301-200f331b189f (is-modified)
[011] #1, state:up persistent_state:present, 'bond0.0': up connection bond0.0, 8b9437bb-6e81-41e5-9306-532bc96d8ae0 (not-active)
[012] #2, state:up persistent_state:present, 'bond0.1': up connection bond0.1, 89e597e4-eed9-47da-b8eb-c0faec291bf7 (not-active)

11044 1726853252.05429: no more pending results, returning what we have 11044 1726853252.05432: results queue empty 11044 1726853252.05433: checking for any_errors_fatal 11044 1726853252.05440: done checking for any_errors_fatal 11044 1726853252.05441: checking for max_fail_percentage 11044 1726853252.05443: done checking for max_fail_percentage 11044 1726853252.05443: checking to see if all hosts have failed and the running result is not ok 11044 1726853252.05447: done checking to see if all hosts have failed 11044 1726853252.05447: getting the remaining hosts for this loop 11044 1726853252.05449: done getting the remaining hosts for this loop 11044 1726853252.05452: getting the next task for host managed_node1 11044 1726853252.05458: done getting next task for host managed_node1 11044 1726853252.05461: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking state 11044 1726853252.05464: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state?
(None), did rescue? False, did start at task? False 11044 1726853252.05482: getting variables 11044 1726853252.05485: in VariableManager get_vars() 11044 1726853252.05524: Calling all_inventory to load vars for managed_node1 11044 1726853252.05527: Calling groups_inventory to load vars for managed_node1 11044 1726853252.05529: Calling all_plugins_inventory to load vars for managed_node1 11044 1726853252.05540: Calling all_plugins_play to load vars for managed_node1 11044 1726853252.05543: Calling groups_plugins_inventory to load vars for managed_node1 11044 1726853252.05550: Calling groups_plugins_play to load vars for managed_node1 11044 1726853252.07536: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11044 1726853252.09217: done with get_vars() 11044 1726853252.09240: done getting variables TASK [fedora.linux_system_roles.network : Configure networking state] ********** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171 Friday 20 September 2024 13:27:32 -0400 (0:00:00.929) 0:00:16.469 ****** 11044 1726853252.09335: entering _queue_task() for managed_node1/fedora.linux_system_roles.network_state 11044 1726853252.09337: Creating lock for fedora.linux_system_roles.network_state 11044 1726853252.09694: worker is 1 (out of 1 available) 11044 1726853252.09707: exiting _queue_task() for managed_node1/fedora.linux_system_roles.network_state 11044 1726853252.09720: done queuing things up, now waiting for results queue to drain 11044 1726853252.09721: waiting for pending results... 
11044 1726853252.09909: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Configure networking state 11044 1726853252.09998: in run() - task 02083763-bbaf-c5a6-f857-000000000037 11044 1726853252.10009: variable 'ansible_search_path' from source: unknown 11044 1726853252.10012: variable 'ansible_search_path' from source: unknown 11044 1726853252.10040: calling self._execute() 11044 1726853252.10113: variable 'ansible_host' from source: host vars for 'managed_node1' 11044 1726853252.10117: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11044 1726853252.10125: variable 'omit' from source: magic vars 11044 1726853252.10408: variable 'ansible_distribution_major_version' from source: facts 11044 1726853252.10417: Evaluated conditional (ansible_distribution_major_version != '6'): True 11044 1726853252.10505: variable 'network_state' from source: role '' defaults 11044 1726853252.10508: Evaluated conditional (network_state != {}): False 11044 1726853252.10511: when evaluation is False, skipping this task 11044 1726853252.10514: _execute() done 11044 1726853252.10517: dumping result to json 11044 1726853252.10519: done dumping result, returning 11044 1726853252.10527: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Configure networking state [02083763-bbaf-c5a6-f857-000000000037] 11044 1726853252.10529: sending task result for task 02083763-bbaf-c5a6-f857-000000000037 11044 1726853252.10618: done sending task result for task 02083763-bbaf-c5a6-f857-000000000037 11044 1726853252.10621: WORKER PROCESS EXITING
skipping: [managed_node1] => {
    "changed": false,
    "false_condition": "network_state != {}",
    "skip_reason": "Conditional result was False"
}
11044 1726853252.10668: no more pending results, returning what we have 11044 1726853252.10673: results queue empty 11044 1726853252.10674: checking for any_errors_fatal 11044 1726853252.10685: done checking for any_errors_fatal
11044 1726853252.10685: checking for max_fail_percentage 11044 1726853252.10687: done checking for max_fail_percentage 11044 1726853252.10688: checking to see if all hosts have failed and the running result is not ok 11044 1726853252.10689: done checking to see if all hosts have failed 11044 1726853252.10690: getting the remaining hosts for this loop 11044 1726853252.10691: done getting the remaining hosts for this loop 11044 1726853252.10694: getting the next task for host managed_node1 11044 1726853252.10700: done getting next task for host managed_node1 11044 1726853252.10703: ^ task is: TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 11044 1726853252.10706: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11044 1726853252.10720: getting variables 11044 1726853252.10721: in VariableManager get_vars() 11044 1726853252.10755: Calling all_inventory to load vars for managed_node1 11044 1726853252.10757: Calling groups_inventory to load vars for managed_node1 11044 1726853252.10759: Calling all_plugins_inventory to load vars for managed_node1 11044 1726853252.10767: Calling all_plugins_play to load vars for managed_node1 11044 1726853252.10769: Calling groups_plugins_inventory to load vars for managed_node1 11044 1726853252.10781: Calling groups_plugins_play to load vars for managed_node1 11044 1726853252.11623: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11044 1726853252.13012: done with get_vars() 11044 1726853252.13035: done getting variables 11044 1726853252.13096: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show stderr messages for the network_connections] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177 Friday 20 September 2024 13:27:32 -0400 (0:00:00.037) 0:00:16.507 ****** 11044 1726853252.13128: entering _queue_task() for managed_node1/debug 11044 1726853252.13458: worker is 1 (out of 1 available) 11044 1726853252.13474: exiting _queue_task() for managed_node1/debug 11044 1726853252.13486: done queuing things up, now waiting for results queue to drain 11044 1726853252.13488: waiting for pending results... 
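The skip recorded for the "Configure networking state" task above reduces to one comparison: `network_state` comes from the role's defaults as an empty mapping, so the task's `when: network_state != {}` conditional is False. A minimal sketch of that evaluation (illustrative only, not the role's actual Jinja2 machinery):

```python
# Per the log, 'network_state' resolves from "role '' defaults" to an
# empty dict, so the conditional (network_state != {}) evaluates False
# and Ansible skips the task with "Conditional result was False".
network_state = {}
run_task = network_state != {}
print("run" if run_task else "skip")
```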
11044 1726853252.13665: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 11044 1726853252.13755: in run() - task 02083763-bbaf-c5a6-f857-000000000038 11044 1726853252.13767: variable 'ansible_search_path' from source: unknown 11044 1726853252.13770: variable 'ansible_search_path' from source: unknown 11044 1726853252.13801: calling self._execute() 11044 1726853252.13870: variable 'ansible_host' from source: host vars for 'managed_node1' 11044 1726853252.13877: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11044 1726853252.13884: variable 'omit' from source: magic vars 11044 1726853252.14149: variable 'ansible_distribution_major_version' from source: facts 11044 1726853252.14166: Evaluated conditional (ansible_distribution_major_version != '6'): True 11044 1726853252.14169: variable 'omit' from source: magic vars 11044 1726853252.14208: variable 'omit' from source: magic vars 11044 1726853252.14233: variable 'omit' from source: magic vars 11044 1726853252.14267: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11044 1726853252.14297: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11044 1726853252.14313: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11044 1726853252.14326: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11044 1726853252.14336: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11044 1726853252.14364: variable 'inventory_hostname' from source: host vars for 'managed_node1' 11044 1726853252.14368: variable 'ansible_host' from source: host vars for 'managed_node1' 11044 1726853252.14372: variable 'ansible_ssh_extra_args' from 
source: host vars for 'managed_node1' 11044 1726853252.14438: Set connection var ansible_timeout to 10 11044 1726853252.14445: Set connection var ansible_shell_executable to /bin/sh 11044 1726853252.14450: Set connection var ansible_shell_type to sh 11044 1726853252.14456: Set connection var ansible_module_compression to ZIP_DEFLATED 11044 1726853252.14461: Set connection var ansible_connection to ssh 11044 1726853252.14465: Set connection var ansible_pipelining to False 11044 1726853252.14485: variable 'ansible_shell_executable' from source: unknown 11044 1726853252.14490: variable 'ansible_connection' from source: unknown 11044 1726853252.14493: variable 'ansible_module_compression' from source: unknown 11044 1726853252.14496: variable 'ansible_shell_type' from source: unknown 11044 1726853252.14499: variable 'ansible_shell_executable' from source: unknown 11044 1726853252.14502: variable 'ansible_host' from source: host vars for 'managed_node1' 11044 1726853252.14504: variable 'ansible_pipelining' from source: unknown 11044 1726853252.14507: variable 'ansible_timeout' from source: unknown 11044 1726853252.14509: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11044 1726853252.14605: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11044 1726853252.14612: variable 'omit' from source: magic vars 11044 1726853252.14619: starting attempt loop 11044 1726853252.14622: running the handler 11044 1726853252.14717: variable '__network_connections_result' from source: set_fact 11044 1726853252.14768: handler run complete 11044 1726853252.14782: attempt loop complete, returning result 11044 1726853252.14785: _execute() done 11044 1726853252.14790: dumping result to json 11044 1726853252.14792: 
done dumping result, returning 11044 1726853252.14799: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections [02083763-bbaf-c5a6-f857-000000000038] 11044 1726853252.14802: sending task result for task 02083763-bbaf-c5a6-f857-000000000038 11044 1726853252.14888: done sending task result for task 02083763-bbaf-c5a6-f857-000000000038 11044 1726853252.14890: WORKER PROCESS EXITING ok: [managed_node1] => { "__network_connections_result.stderr_lines": [ "[007] #0, state:up persistent_state:present, 'bond0': add connection bond0, d9b8d035-3bc8-441e-9301-200f331b189f", "[008] #1, state:up persistent_state:present, 'bond0.0': add connection bond0.0, 8b9437bb-6e81-41e5-9306-532bc96d8ae0", "[009] #2, state:up persistent_state:present, 'bond0.1': add connection bond0.1, 89e597e4-eed9-47da-b8eb-c0faec291bf7", "[010] #0, state:up persistent_state:present, 'bond0': up connection bond0, d9b8d035-3bc8-441e-9301-200f331b189f (is-modified)", "[011] #1, state:up persistent_state:present, 'bond0.0': up connection bond0.0, 8b9437bb-6e81-41e5-9306-532bc96d8ae0 (not-active)", "[012] #2, state:up persistent_state:present, 'bond0.1': up connection bond0.1, 89e597e4-eed9-47da-b8eb-c0faec291bf7 (not-active)" ] } 11044 1726853252.14959: no more pending results, returning what we have 11044 1726853252.14962: results queue empty 11044 1726853252.14963: checking for any_errors_fatal 11044 1726853252.14969: done checking for any_errors_fatal 11044 1726853252.14970: checking for max_fail_percentage 11044 1726853252.14974: done checking for max_fail_percentage 11044 1726853252.14974: checking to see if all hosts have failed and the running result is not ok 11044 1726853252.14975: done checking to see if all hosts have failed 11044 1726853252.14976: getting the remaining hosts for this loop 11044 1726853252.14977: done getting the remaining hosts for this loop 11044 1726853252.14981: getting the next task for host 
managed_node1 11044 1726853252.14985: done getting next task for host managed_node1 11044 1726853252.14988: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 11044 1726853252.14991: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=23, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11044 1726853252.15002: getting variables 11044 1726853252.15003: in VariableManager get_vars() 11044 1726853252.15037: Calling all_inventory to load vars for managed_node1 11044 1726853252.15039: Calling groups_inventory to load vars for managed_node1 11044 1726853252.15041: Calling all_plugins_inventory to load vars for managed_node1 11044 1726853252.15049: Calling all_plugins_play to load vars for managed_node1 11044 1726853252.15051: Calling groups_plugins_inventory to load vars for managed_node1 11044 1726853252.15054: Calling groups_plugins_play to load vars for managed_node1 11044 1726853252.16338: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11044 1726853252.17233: done with get_vars() 11044 1726853252.17254: done getting variables 11044 1726853252.17299: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show 
debug messages for the network_connections] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181 Friday 20 September 2024 13:27:32 -0400 (0:00:00.041) 0:00:16.549 ****** 11044 1726853252.17326: entering _queue_task() for managed_node1/debug 11044 1726853252.17579: worker is 1 (out of 1 available) 11044 1726853252.17592: exiting _queue_task() for managed_node1/debug 11044 1726853252.17606: done queuing things up, now waiting for results queue to drain 11044 1726853252.17607: waiting for pending results... 11044 1726853252.17785: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 11044 1726853252.17868: in run() - task 02083763-bbaf-c5a6-f857-000000000039 11044 1726853252.17881: variable 'ansible_search_path' from source: unknown 11044 1726853252.17884: variable 'ansible_search_path' from source: unknown 11044 1726853252.17914: calling self._execute() 11044 1726853252.17984: variable 'ansible_host' from source: host vars for 'managed_node1' 11044 1726853252.17989: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11044 1726853252.17998: variable 'omit' from source: magic vars 11044 1726853252.18274: variable 'ansible_distribution_major_version' from source: facts 11044 1726853252.18284: Evaluated conditional (ansible_distribution_major_version != '6'): True 11044 1726853252.18287: variable 'omit' from source: magic vars 11044 1726853252.18322: variable 'omit' from source: magic vars 11044 1726853252.18347: variable 'omit' from source: magic vars 11044 1726853252.18380: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11044 1726853252.18409: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11044 1726853252.18431: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 
11044 1726853252.18446: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11044 1726853252.18455: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11044 1726853252.18482: variable 'inventory_hostname' from source: host vars for 'managed_node1' 11044 1726853252.18485: variable 'ansible_host' from source: host vars for 'managed_node1' 11044 1726853252.18488: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11044 1726853252.18555: Set connection var ansible_timeout to 10 11044 1726853252.18562: Set connection var ansible_shell_executable to /bin/sh 11044 1726853252.18565: Set connection var ansible_shell_type to sh 11044 1726853252.18569: Set connection var ansible_module_compression to ZIP_DEFLATED 11044 1726853252.18576: Set connection var ansible_connection to ssh 11044 1726853252.18581: Set connection var ansible_pipelining to False 11044 1726853252.18601: variable 'ansible_shell_executable' from source: unknown 11044 1726853252.18606: variable 'ansible_connection' from source: unknown 11044 1726853252.18609: variable 'ansible_module_compression' from source: unknown 11044 1726853252.18612: variable 'ansible_shell_type' from source: unknown 11044 1726853252.18614: variable 'ansible_shell_executable' from source: unknown 11044 1726853252.18616: variable 'ansible_host' from source: host vars for 'managed_node1' 11044 1726853252.18618: variable 'ansible_pipelining' from source: unknown 11044 1726853252.18620: variable 'ansible_timeout' from source: unknown 11044 1726853252.18623: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11044 1726853252.18775: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11044 1726853252.18779: variable 'omit' from source: magic vars 11044 1726853252.18781: starting attempt loop 11044 1726853252.18784: running the handler 11044 1726853252.18811: variable '__network_connections_result' from source: set_fact 11044 1726853252.18891: variable '__network_connections_result' from source: set_fact 11044 1726853252.19088: handler run complete 11044 1726853252.19093: attempt loop complete, returning result 11044 1726853252.19095: _execute() done 11044 1726853252.19105: dumping result to json 11044 1726853252.19108: done dumping result, returning 11044 1726853252.19112: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections [02083763-bbaf-c5a6-f857-000000000039] 11044 1726853252.19118: sending task result for task 02083763-bbaf-c5a6-f857-000000000039 11044 1726853252.19309: done sending task result for task 02083763-bbaf-c5a6-f857-000000000039 11044 1726853252.19312: WORKER PROCESS EXITING ok: [managed_node1] => { "__network_connections_result": { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "bond": { "miimon": 110, "mode": "active-backup" }, "interface_name": "deprecated-bond", "ip": { "route_metric4": 65535 }, "name": "bond0", "state": "up", "type": "bond" }, { "interface_name": "test1", "master": "bond0", "name": "bond0.0", "state": "up", "type": "ethernet" }, { "interface_name": "test2", "master": "bond0", "name": "bond0.1", "state": "up", "type": "ethernet" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true, "failed": false, "stderr": "[007] #0, state:up persistent_state:present, 'bond0': add connection bond0, 
d9b8d035-3bc8-441e-9301-200f331b189f\n[008] #1, state:up persistent_state:present, 'bond0.0': add connection bond0.0, 8b9437bb-6e81-41e5-9306-532bc96d8ae0\n[009] #2, state:up persistent_state:present, 'bond0.1': add connection bond0.1, 89e597e4-eed9-47da-b8eb-c0faec291bf7\n[010] #0, state:up persistent_state:present, 'bond0': up connection bond0, d9b8d035-3bc8-441e-9301-200f331b189f (is-modified)\n[011] #1, state:up persistent_state:present, 'bond0.0': up connection bond0.0, 8b9437bb-6e81-41e5-9306-532bc96d8ae0 (not-active)\n[012] #2, state:up persistent_state:present, 'bond0.1': up connection bond0.1, 89e597e4-eed9-47da-b8eb-c0faec291bf7 (not-active)\n", "stderr_lines": [ "[007] #0, state:up persistent_state:present, 'bond0': add connection bond0, d9b8d035-3bc8-441e-9301-200f331b189f", "[008] #1, state:up persistent_state:present, 'bond0.0': add connection bond0.0, 8b9437bb-6e81-41e5-9306-532bc96d8ae0", "[009] #2, state:up persistent_state:present, 'bond0.1': add connection bond0.1, 89e597e4-eed9-47da-b8eb-c0faec291bf7", "[010] #0, state:up persistent_state:present, 'bond0': up connection bond0, d9b8d035-3bc8-441e-9301-200f331b189f (is-modified)", "[011] #1, state:up persistent_state:present, 'bond0.0': up connection bond0.0, 8b9437bb-6e81-41e5-9306-532bc96d8ae0 (not-active)", "[012] #2, state:up persistent_state:present, 'bond0.1': up connection bond0.1, 89e597e4-eed9-47da-b8eb-c0faec291bf7 (not-active)" ] } } 11044 1726853252.19452: no more pending results, returning what we have 11044 1726853252.19455: results queue empty 11044 1726853252.19461: checking for any_errors_fatal 11044 1726853252.19487: done checking for any_errors_fatal 11044 1726853252.19488: checking for max_fail_percentage 11044 1726853252.19490: done checking for max_fail_percentage 11044 1726853252.19491: checking to see if all hosts have failed and the running result is not ok 11044 1726853252.19492: done checking to see if all hosts have failed 11044 1726853252.19492: getting the remaining 
hosts for this loop 11044 1726853252.19493: done getting the remaining hosts for this loop 11044 1726853252.19496: getting the next task for host managed_node1 11044 1726853252.19500: done getting next task for host managed_node1 11044 1726853252.19502: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 11044 1726853252.19504: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11044 1726853252.19511: getting variables 11044 1726853252.19512: in VariableManager get_vars() 11044 1726853252.19554: Calling all_inventory to load vars for managed_node1 11044 1726853252.19557: Calling groups_inventory to load vars for managed_node1 11044 1726853252.19560: Calling all_plugins_inventory to load vars for managed_node1 11044 1726853252.19568: Calling all_plugins_play to load vars for managed_node1 11044 1726853252.19577: Calling groups_plugins_inventory to load vars for managed_node1 11044 1726853252.19581: Calling groups_plugins_play to load vars for managed_node1 11044 1726853252.20672: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11044 1726853252.22157: done with get_vars() 11044 1726853252.22180: done getting variables 11044 1726853252.22229: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_state] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186 Friday 20 September 2024 13:27:32 -0400 (0:00:00.049) 0:00:16.598 ****** 11044 1726853252.22282: entering _queue_task() for managed_node1/debug 11044 1726853252.22614: worker is 1 (out of 1 available) 11044 1726853252.22628: exiting _queue_task() for managed_node1/debug 11044 1726853252.22641: done queuing things up, now waiting for results queue to drain 11044 1726853252.22642: waiting for pending results... 11044 1726853252.22869: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 11044 1726853252.23009: in run() - task 02083763-bbaf-c5a6-f857-00000000003a 11044 1726853252.23020: variable 'ansible_search_path' from source: unknown 11044 1726853252.23024: variable 'ansible_search_path' from source: unknown 11044 1726853252.23054: calling self._execute() 11044 1726853252.23145: variable 'ansible_host' from source: host vars for 'managed_node1' 11044 1726853252.23153: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11044 1726853252.23161: variable 'omit' from source: magic vars 11044 1726853252.23506: variable 'ansible_distribution_major_version' from source: facts 11044 1726853252.23516: Evaluated conditional (ansible_distribution_major_version != '6'): True 11044 1726853252.23637: variable 'network_state' from source: role '' defaults 11044 1726853252.23648: Evaluated conditional (network_state != {}): False 11044 1726853252.23651: when evaluation is False, skipping this task 11044 1726853252.23654: _execute() done 11044 1726853252.23656: dumping result to json 11044 1726853252.23662: done 
dumping result, returning 11044 1726853252.23670: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state [02083763-bbaf-c5a6-f857-00000000003a] 11044 1726853252.23674: sending task result for task 02083763-bbaf-c5a6-f857-00000000003a 11044 1726853252.23769: done sending task result for task 02083763-bbaf-c5a6-f857-00000000003a 11044 1726853252.23774: WORKER PROCESS EXITING skipping: [managed_node1] => { "false_condition": "network_state != {}" } 11044 1726853252.23837: no more pending results, returning what we have 11044 1726853252.23848: results queue empty 11044 1726853252.23849: checking for any_errors_fatal 11044 1726853252.23860: done checking for any_errors_fatal 11044 1726853252.23861: checking for max_fail_percentage 11044 1726853252.23862: done checking for max_fail_percentage 11044 1726853252.23864: checking to see if all hosts have failed and the running result is not ok 11044 1726853252.23866: done checking to see if all hosts have failed 11044 1726853252.23867: getting the remaining hosts for this loop 11044 1726853252.23868: done getting the remaining hosts for this loop 11044 1726853252.23873: getting the next task for host managed_node1 11044 1726853252.23878: done getting next task for host managed_node1 11044 1726853252.23882: ^ task is: TASK: fedora.linux_system_roles.network : Re-test connectivity 11044 1726853252.23885: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False 11044 1726853252.23899: getting variables 11044 1726853252.23900: in VariableManager get_vars() 11044 1726853252.23932: Calling all_inventory to load vars for managed_node1 11044 1726853252.23934: Calling groups_inventory to load vars for managed_node1 11044 1726853252.23936: Calling all_plugins_inventory to load vars for managed_node1 11044 1726853252.23947: Calling all_plugins_play to load vars for managed_node1 11044 1726853252.23950: Calling groups_plugins_inventory to load vars for managed_node1 11044 1726853252.23952: Calling groups_plugins_play to load vars for managed_node1 11044 1726853252.24788: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11044 1726853252.26033: done with get_vars() 11044 1726853252.26055: done getting variables TASK [fedora.linux_system_roles.network : Re-test connectivity] **************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192 Friday 20 September 2024 13:27:32 -0400 (0:00:00.038) 0:00:16.637 ****** 11044 1726853252.26127: entering _queue_task() for managed_node1/ping 11044 1726853252.26128: Creating lock for ping 11044 1726853252.26385: worker is 1 (out of 1 available) 11044 1726853252.26399: exiting _queue_task() for managed_node1/ping 11044 1726853252.26415: done queuing things up, now waiting for results queue to drain 11044 1726853252.26417: waiting for pending results... 
11044 1726853252.26699: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Re-test connectivity 11044 1726853252.26809: in run() - task 02083763-bbaf-c5a6-f857-00000000003b 11044 1726853252.26813: variable 'ansible_search_path' from source: unknown 11044 1726853252.26816: variable 'ansible_search_path' from source: unknown 11044 1726853252.26864: calling self._execute() 11044 1726853252.26963: variable 'ansible_host' from source: host vars for 'managed_node1' 11044 1726853252.26983: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11044 1726853252.26987: variable 'omit' from source: magic vars 11044 1726853252.27326: variable 'ansible_distribution_major_version' from source: facts 11044 1726853252.27346: Evaluated conditional (ansible_distribution_major_version != '6'): True 11044 1726853252.27366: variable 'omit' from source: magic vars 11044 1726853252.27412: variable 'omit' from source: magic vars 11044 1726853252.27442: variable 'omit' from source: magic vars 11044 1726853252.27511: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11044 1726853252.27535: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11044 1726853252.27542: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11044 1726853252.27560: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11044 1726853252.27573: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11044 1726853252.27602: variable 'inventory_hostname' from source: host vars for 'managed_node1' 11044 1726853252.27621: variable 'ansible_host' from source: host vars for 'managed_node1' 11044 1726853252.27628: variable 'ansible_ssh_extra_args' from source: host vars for 
'managed_node1' 11044 1726853252.27701: Set connection var ansible_timeout to 10 11044 1726853252.27717: Set connection var ansible_shell_executable to /bin/sh 11044 1726853252.27720: Set connection var ansible_shell_type to sh 11044 1726853252.27723: Set connection var ansible_module_compression to ZIP_DEFLATED 11044 1726853252.27725: Set connection var ansible_connection to ssh 11044 1726853252.27727: Set connection var ansible_pipelining to False 11044 1726853252.27782: variable 'ansible_shell_executable' from source: unknown 11044 1726853252.27785: variable 'ansible_connection' from source: unknown 11044 1726853252.27788: variable 'ansible_module_compression' from source: unknown 11044 1726853252.27795: variable 'ansible_shell_type' from source: unknown 11044 1726853252.27802: variable 'ansible_shell_executable' from source: unknown 11044 1726853252.27804: variable 'ansible_host' from source: host vars for 'managed_node1' 11044 1726853252.27806: variable 'ansible_pipelining' from source: unknown 11044 1726853252.27808: variable 'ansible_timeout' from source: unknown 11044 1726853252.27810: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11044 1726853252.27984: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 11044 1726853252.27991: variable 'omit' from source: magic vars 11044 1726853252.27996: starting attempt loop 11044 1726853252.27999: running the handler 11044 1726853252.28010: _low_level_execute_command(): starting 11044 1726853252.28018: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11044 1726853252.28959: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK <<< 11044 1726853252.29146: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11044 1726853252.29221: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11044 1726853252.30888: stdout chunk (state=3): >>>/root <<< 11044 1726853252.31081: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11044 1726853252.31084: stderr chunk (state=3): >>><<< 11044 1726853252.31089: stdout chunk (state=3): >>><<< 11044 1726853252.31106: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: 
Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11044 1726853252.31131: _low_level_execute_command(): starting 11044 1726853252.31140: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853252.3110914-11897-158759480362503 `" && echo ansible-tmp-1726853252.3110914-11897-158759480362503="` echo /root/.ansible/tmp/ansible-tmp-1726853252.3110914-11897-158759480362503 `" ) && sleep 0' 11044 1726853252.31640: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11044 1726853252.31667: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found <<< 11044 1726853252.31700: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11044 1726853252.31748: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 11044 1726853252.31752: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11044 1726853252.31756: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11044 1726853252.31797: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11044 1726853252.33683: stdout chunk (state=3): >>>ansible-tmp-1726853252.3110914-11897-158759480362503=/root/.ansible/tmp/ansible-tmp-1726853252.3110914-11897-158759480362503 <<< 11044 1726853252.33852: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11044 1726853252.33856: stderr chunk (state=3): >>><<< 11044 1726853252.33859: stdout chunk (state=3): >>><<< 11044 1726853252.33882: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853252.3110914-11897-158759480362503=/root/.ansible/tmp/ansible-tmp-1726853252.3110914-11897-158759480362503 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match 
found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11044 1726853252.33941: variable 'ansible_module_compression' from source: unknown 11044 1726853252.34076: ANSIBALLZ: Using lock for ping 11044 1726853252.34079: ANSIBALLZ: Acquiring lock 11044 1726853252.34081: ANSIBALLZ: Lock acquired: 140360200269712 11044 1726853252.34083: ANSIBALLZ: Creating module 11044 1726853252.46879: ANSIBALLZ: Writing module into payload 11044 1726853252.46883: ANSIBALLZ: Writing module 11044 1726853252.46885: ANSIBALLZ: Renaming module 11044 1726853252.46887: ANSIBALLZ: Done creating module 11044 1726853252.46889: variable 'ansible_facts' from source: unknown 11044 1726853252.46958: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853252.3110914-11897-158759480362503/AnsiballZ_ping.py 11044 1726853252.47139: Sending initial data 11044 1726853252.47143: Sent initial data (153 bytes) 11044 1726853252.47817: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final 
all' host 10.31.45.153 originally 10.31.45.153 debug2: match found <<< 11044 1726853252.47826: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11044 1726853252.47892: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 11044 1726853252.47905: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11044 1726853252.47926: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11044 1726853252.47988: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11044 1726853252.49981: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 11044 1726853252.49986: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 11044 1726853252.49990: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-1104467doc9gy/tmpk0c3xhbg /root/.ansible/tmp/ansible-tmp-1726853252.3110914-11897-158759480362503/AnsiballZ_ping.py <<< 11044 1726853252.49992: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853252.3110914-11897-158759480362503/AnsiballZ_ping.py" <<< 11044 1726853252.49995: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-1104467doc9gy/tmpk0c3xhbg" to remote "/root/.ansible/tmp/ansible-tmp-1726853252.3110914-11897-158759480362503/AnsiballZ_ping.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853252.3110914-11897-158759480362503/AnsiballZ_ping.py" <<< 11044 1726853252.50665: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11044 1726853252.50739: stderr chunk (state=3): >>><<< 11044 1726853252.50747: stdout chunk (state=3): >>><<< 11044 1726853252.50812: done transferring module to remote 11044 1726853252.50976: _low_level_execute_command(): starting 11044 1726853252.50980: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853252.3110914-11897-158759480362503/ /root/.ansible/tmp/ansible-tmp-1726853252.3110914-11897-158759480362503/AnsiballZ_ping.py && sleep 0' 11044 1726853252.51491: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11044 1726853252.51595: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: 
hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK <<< 11044 1726853252.51617: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11044 1726853252.51681: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11044 1726853252.53652: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11044 1726853252.53655: stderr chunk (state=3): >>><<< 11044 1726853252.53657: stdout chunk (state=3): >>><<< 11044 1726853252.53659: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master 
at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11044 1726853252.53662: _low_level_execute_command(): starting 11044 1726853252.53663: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853252.3110914-11897-158759480362503/AnsiballZ_ping.py && sleep 0' 11044 1726853252.54366: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11044 1726853252.54421: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 11044 1726853252.54425: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11044 1726853252.54474: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11044 1726853252.54523: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11044 1726853252.69767: stdout chunk (state=3): >>> {"ping": "pong", "invocation": {"module_args": {"data": 
"pong"}}} <<< 11044 1726853252.71184: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.45.153 closed. <<< 11044 1726853252.71188: stdout chunk (state=3): >>><<< 11044 1726853252.71190: stderr chunk (state=3): >>><<< 11044 1726853252.71193: _low_level_execute_command() done: rc=0, stdout= {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.45.153 closed. 
11044 1726853252.71196: done with _execute_module (ping, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ping', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853252.3110914-11897-158759480362503/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11044 1726853252.71198: _low_level_execute_command(): starting 11044 1726853252.71200: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853252.3110914-11897-158759480362503/ > /dev/null 2>&1 && sleep 0' 11044 1726853252.71819: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11044 1726853252.71826: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11044 1726853252.71829: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11044 1726853252.71831: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11044 1726853252.71834: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 <<< 11044 1726853252.71835: stderr chunk (state=3): >>>debug2: match not found <<< 11044 1726853252.71837: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11044 1726853252.71839: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 11044 1726853252.71841: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.45.153 is address <<< 11044 
1726853252.71842: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 11044 1726853252.71964: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK <<< 11044 1726853252.71968: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11044 1726853252.72154: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11044 1726853252.73982: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11044 1726853252.73986: stdout chunk (state=3): >>><<< 11044 1726853252.73989: stderr chunk (state=3): >>><<< 11044 1726853252.73991: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11044 1726853252.73999: handler run complete 11044 1726853252.74002: attempt loop complete, returning result 11044 1726853252.74004: _execute() done 11044 1726853252.74007: dumping result to json 11044 1726853252.74008: done dumping result, returning 11044 1726853252.74010: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Re-test connectivity [02083763-bbaf-c5a6-f857-00000000003b] 11044 1726853252.74012: sending task result for task 02083763-bbaf-c5a6-f857-00000000003b ok: [managed_node1] => { "changed": false, "ping": "pong" } 11044 1726853252.74356: no more pending results, returning what we have 11044 1726853252.74359: results queue empty 11044 1726853252.74360: checking for any_errors_fatal 11044 1726853252.74365: done checking for any_errors_fatal 11044 1726853252.74366: checking for max_fail_percentage 11044 1726853252.74368: done checking for max_fail_percentage 11044 1726853252.74369: checking to see if all hosts have failed and the running result is not ok 11044 1726853252.74369: done checking to see if all hosts have failed 11044 1726853252.74370: getting the remaining hosts for this loop 11044 1726853252.74377: done getting the remaining hosts for this loop 11044 1726853252.74384: getting the next task for host managed_node1 11044 1726853252.74394: done getting next task for host managed_node1 11044 1726853252.74396: ^ task is: TASK: meta (role_complete) 11044 1726853252.74399: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11044 1726853252.74480: getting variables 11044 1726853252.74482: in VariableManager get_vars() 11044 1726853252.74528: Calling all_inventory to load vars for managed_node1 11044 1726853252.74531: Calling groups_inventory to load vars for managed_node1 11044 1726853252.74534: Calling all_plugins_inventory to load vars for managed_node1 11044 1726853252.74578: done sending task result for task 02083763-bbaf-c5a6-f857-00000000003b 11044 1726853252.74582: WORKER PROCESS EXITING 11044 1726853252.74591: Calling all_plugins_play to load vars for managed_node1 11044 1726853252.74600: Calling groups_plugins_inventory to load vars for managed_node1 11044 1726853252.74605: Calling groups_plugins_play to load vars for managed_node1 11044 1726853252.78330: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11044 1726853252.81734: done with get_vars() 11044 1726853252.81888: done getting variables 11044 1726853252.82095: done queuing things up, now waiting for results queue to drain 11044 1726853252.82097: results queue empty 11044 1726853252.82098: checking for any_errors_fatal 11044 1726853252.82101: done checking for any_errors_fatal 11044 1726853252.82102: checking for max_fail_percentage 11044 1726853252.82103: done checking for max_fail_percentage 11044 1726853252.82103: checking to see if all hosts have failed and the running result is not ok 11044 1726853252.82104: done checking to see if all hosts have failed 11044 1726853252.82105: getting the remaining hosts for this loop 11044 1726853252.82106: done getting the remaining hosts 
for this loop 11044 1726853252.82108: getting the next task for host managed_node1 11044 1726853252.82113: done getting next task for host managed_node1 11044 1726853252.82116: ^ task is: TASK: Include the task 'get_interface_stat.yml' 11044 1726853252.82118: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11044 1726853252.82121: getting variables 11044 1726853252.82122: in VariableManager get_vars() 11044 1726853252.82138: Calling all_inventory to load vars for managed_node1 11044 1726853252.82141: Calling groups_inventory to load vars for managed_node1 11044 1726853252.82143: Calling all_plugins_inventory to load vars for managed_node1 11044 1726853252.82151: Calling all_plugins_play to load vars for managed_node1 11044 1726853252.82153: Calling groups_plugins_inventory to load vars for managed_node1 11044 1726853252.82156: Calling groups_plugins_play to load vars for managed_node1 11044 1726853252.84633: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11044 1726853252.86144: done with get_vars() 11044 1726853252.86167: done getting variables TASK [Include the task 'get_interface_stat.yml'] ******************************* task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml:3 Friday 20 September 2024 13:27:32 -0400 (0:00:00.601) 0:00:17.238 ****** 11044 1726853252.86232: entering _queue_task() for 
managed_node1/include_tasks 11044 1726853252.86500: worker is 1 (out of 1 available) 11044 1726853252.86519: exiting _queue_task() for managed_node1/include_tasks 11044 1726853252.86532: done queuing things up, now waiting for results queue to drain 11044 1726853252.86534: waiting for pending results... 11044 1726853252.86705: running TaskExecutor() for managed_node1/TASK: Include the task 'get_interface_stat.yml' 11044 1726853252.86826: in run() - task 02083763-bbaf-c5a6-f857-00000000006e 11044 1726853252.86834: variable 'ansible_search_path' from source: unknown 11044 1726853252.86837: variable 'ansible_search_path' from source: unknown 11044 1726853252.86847: calling self._execute() 11044 1726853252.86931: variable 'ansible_host' from source: host vars for 'managed_node1' 11044 1726853252.86936: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11044 1726853252.86944: variable 'omit' from source: magic vars 11044 1726853252.87289: variable 'ansible_distribution_major_version' from source: facts 11044 1726853252.87308: Evaluated conditional (ansible_distribution_major_version != '6'): True 11044 1726853252.87313: _execute() done 11044 1726853252.87317: dumping result to json 11044 1726853252.87319: done dumping result, returning 11044 1726853252.87322: done running TaskExecutor() for managed_node1/TASK: Include the task 'get_interface_stat.yml' [02083763-bbaf-c5a6-f857-00000000006e] 11044 1726853252.87324: sending task result for task 02083763-bbaf-c5a6-f857-00000000006e 11044 1726853252.87420: done sending task result for task 02083763-bbaf-c5a6-f857-00000000006e 11044 1726853252.87423: WORKER PROCESS EXITING 11044 1726853252.87472: no more pending results, returning what we have 11044 1726853252.87477: in VariableManager get_vars() 11044 1726853252.87527: Calling all_inventory to load vars for managed_node1 11044 1726853252.87529: Calling groups_inventory to load vars for managed_node1 11044 1726853252.87532: Calling 
all_plugins_inventory to load vars for managed_node1 11044 1726853252.87543: Calling all_plugins_play to load vars for managed_node1 11044 1726853252.87546: Calling groups_plugins_inventory to load vars for managed_node1 11044 1726853252.87549: Calling groups_plugins_play to load vars for managed_node1 11044 1726853252.89973: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11044 1726853252.90860: done with get_vars() 11044 1726853252.90878: variable 'ansible_search_path' from source: unknown 11044 1726853252.90880: variable 'ansible_search_path' from source: unknown 11044 1726853252.90906: we have included files to process 11044 1726853252.90907: generating all_blocks data 11044 1726853252.90908: done generating all_blocks data 11044 1726853252.90912: processing included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 11044 1726853252.90913: loading included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 11044 1726853252.90915: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 11044 1726853252.91051: done processing included file 11044 1726853252.91053: iterating over new_blocks loaded from include file 11044 1726853252.91054: in VariableManager get_vars() 11044 1726853252.91070: done with get_vars() 11044 1726853252.91073: filtering new block on tags 11044 1726853252.91088: done filtering new block on tags 11044 1726853252.91089: done iterating over new_blocks loaded from include file included: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml for managed_node1 11044 1726853252.91093: extending task lists for all hosts with included blocks 11044 1726853252.91151: done extending task lists 11044 
1726853252.91152: done processing included files 11044 1726853252.91153: results queue empty 11044 1726853252.91153: checking for any_errors_fatal 11044 1726853252.91154: done checking for any_errors_fatal 11044 1726853252.91154: checking for max_fail_percentage 11044 1726853252.91155: done checking for max_fail_percentage 11044 1726853252.91155: checking to see if all hosts have failed and the running result is not ok 11044 1726853252.91156: done checking to see if all hosts have failed 11044 1726853252.91156: getting the remaining hosts for this loop 11044 1726853252.91157: done getting the remaining hosts for this loop 11044 1726853252.91159: getting the next task for host managed_node1 11044 1726853252.91161: done getting next task for host managed_node1 11044 1726853252.91162: ^ task is: TASK: Get stat for interface {{ interface }} 11044 1726853252.91165: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11044 1726853252.91167: getting variables 11044 1726853252.91168: in VariableManager get_vars() 11044 1726853252.91178: Calling all_inventory to load vars for managed_node1 11044 1726853252.91180: Calling groups_inventory to load vars for managed_node1 11044 1726853252.91182: Calling all_plugins_inventory to load vars for managed_node1 11044 1726853252.91187: Calling all_plugins_play to load vars for managed_node1 11044 1726853252.91189: Calling groups_plugins_inventory to load vars for managed_node1 11044 1726853252.91191: Calling groups_plugins_play to load vars for managed_node1 11044 1726853252.91822: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11044 1726853252.93217: done with get_vars() 11044 1726853252.93237: done getting variables 11044 1726853252.93407: variable 'interface' from source: task vars 11044 1726853252.93411: variable 'controller_device' from source: play vars 11044 1726853252.93472: variable 'controller_device' from source: play vars TASK [Get stat for interface deprecated-bond] ********************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml:3 Friday 20 September 2024 13:27:32 -0400 (0:00:00.072) 0:00:17.310 ****** 11044 1726853252.93503: entering _queue_task() for managed_node1/stat 11044 1726853252.94352: worker is 1 (out of 1 available) 11044 1726853252.94367: exiting _queue_task() for managed_node1/stat 11044 1726853252.94382: done queuing things up, now waiting for results queue to drain 11044 1726853252.94384: waiting for pending results... 
11044 1726853252.94931: running TaskExecutor() for managed_node1/TASK: Get stat for interface deprecated-bond 11044 1726853252.94937: in run() - task 02083763-bbaf-c5a6-f857-000000000242 11044 1726853252.94954: variable 'ansible_search_path' from source: unknown 11044 1726853252.94958: variable 'ansible_search_path' from source: unknown 11044 1726853252.95029: calling self._execute() 11044 1726853252.95086: variable 'ansible_host' from source: host vars for 'managed_node1' 11044 1726853252.95092: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11044 1726853252.95101: variable 'omit' from source: magic vars 11044 1726853252.95449: variable 'ansible_distribution_major_version' from source: facts 11044 1726853252.95464: Evaluated conditional (ansible_distribution_major_version != '6'): True 11044 1726853252.95467: variable 'omit' from source: magic vars 11044 1726853252.95575: variable 'omit' from source: magic vars 11044 1726853252.95612: variable 'interface' from source: task vars 11044 1726853252.95615: variable 'controller_device' from source: play vars 11044 1726853252.95677: variable 'controller_device' from source: play vars 11044 1726853252.95691: variable 'omit' from source: magic vars 11044 1726853252.95736: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11044 1726853252.95769: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11044 1726853252.95876: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11044 1726853252.95879: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11044 1726853252.95883: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11044 1726853252.95886: variable 'inventory_hostname' from source: host vars 
for 'managed_node1' 11044 1726853252.95895: variable 'ansible_host' from source: host vars for 'managed_node1' 11044 1726853252.95898: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11044 1726853252.95952: Set connection var ansible_timeout to 10 11044 1726853252.95961: Set connection var ansible_shell_executable to /bin/sh 11044 1726853252.95964: Set connection var ansible_shell_type to sh 11044 1726853252.95968: Set connection var ansible_module_compression to ZIP_DEFLATED 11044 1726853252.95974: Set connection var ansible_connection to ssh 11044 1726853252.95980: Set connection var ansible_pipelining to False 11044 1726853252.96005: variable 'ansible_shell_executable' from source: unknown 11044 1726853252.96009: variable 'ansible_connection' from source: unknown 11044 1726853252.96013: variable 'ansible_module_compression' from source: unknown 11044 1726853252.96015: variable 'ansible_shell_type' from source: unknown 11044 1726853252.96018: variable 'ansible_shell_executable' from source: unknown 11044 1726853252.96020: variable 'ansible_host' from source: host vars for 'managed_node1' 11044 1726853252.96022: variable 'ansible_pipelining' from source: unknown 11044 1726853252.96026: variable 'ansible_timeout' from source: unknown 11044 1726853252.96029: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11044 1726853252.96225: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 11044 1726853252.96229: variable 'omit' from source: magic vars 11044 1726853252.96234: starting attempt loop 11044 1726853252.96238: running the handler 11044 1726853252.96254: _low_level_execute_command(): starting 11044 1726853252.96263: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11044 
1726853252.97077: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11044 1726853252.97083: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11044 1726853252.97092: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11044 1726853252.97153: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 11044 1726853252.97157: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11044 1726853252.97159: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11044 1726853252.97441: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11044 1726853252.99176: stdout chunk (state=3): >>>/root <<< 11044 1726853252.99508: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11044 1726853252.99512: stdout chunk (state=3): >>><<< 11044 1726853252.99515: stderr chunk (state=3): >>><<< 11044 1726853252.99519: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11044 1726853252.99522: _low_level_execute_command(): starting 11044 1726853252.99525: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853252.9941638-11948-69517600501083 `" && echo ansible-tmp-1726853252.9941638-11948-69517600501083="` echo /root/.ansible/tmp/ansible-tmp-1726853252.9941638-11948-69517600501083 `" ) && sleep 0' 11044 1726853253.00048: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11044 1726853253.00063: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11044 1726853253.00080: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11044 1726853253.00105: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11044 1726853253.00131: stderr chunk (state=3): >>>debug2: checking match for 
'final all' host 10.31.45.153 originally 10.31.45.153 <<< 11044 1726853253.00141: stderr chunk (state=3): >>>debug2: match not found <<< 11044 1726853253.00191: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11044 1726853253.00248: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 11044 1726853253.00273: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11044 1726853253.00287: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11044 1726853253.00358: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11044 1726853253.02258: stdout chunk (state=3): >>>ansible-tmp-1726853252.9941638-11948-69517600501083=/root/.ansible/tmp/ansible-tmp-1726853252.9941638-11948-69517600501083 <<< 11044 1726853253.02423: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11044 1726853253.02437: stderr chunk (state=3): >>><<< 11044 1726853253.02447: stdout chunk (state=3): >>><<< 11044 1726853253.02677: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853252.9941638-11948-69517600501083=/root/.ansible/tmp/ansible-tmp-1726853252.9941638-11948-69517600501083 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11044 1726853253.02681: variable 'ansible_module_compression' from source: unknown 11044 1726853253.02683: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-1104467doc9gy/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 11044 1726853253.02685: variable 'ansible_facts' from source: unknown 11044 1726853253.02725: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853252.9941638-11948-69517600501083/AnsiballZ_stat.py 11044 1726853253.02931: Sending initial data 11044 1726853253.02934: Sent initial data (152 bytes) 11044 1726853253.03484: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11044 1726853253.03579: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration 
data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11044 1726853253.03607: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 11044 1726853253.03625: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11044 1726853253.03647: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11044 1726853253.03723: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11044 1726853253.05340: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" <<< 11044 1726853253.05348: stderr chunk (state=3): >>>debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 11044 1726853253.05376: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 11044 1726853253.05433: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-1104467doc9gy/tmpindpt43g /root/.ansible/tmp/ansible-tmp-1726853252.9941638-11948-69517600501083/AnsiballZ_stat.py <<< 11044 1726853253.05436: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853252.9941638-11948-69517600501083/AnsiballZ_stat.py" <<< 11044 1726853253.05473: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-1104467doc9gy/tmpindpt43g" to remote "/root/.ansible/tmp/ansible-tmp-1726853252.9941638-11948-69517600501083/AnsiballZ_stat.py" <<< 11044 1726853253.05479: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853252.9941638-11948-69517600501083/AnsiballZ_stat.py" <<< 11044 1726853253.06000: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11044 1726853253.06053: stderr chunk (state=3): >>><<< 11044 1726853253.06057: stdout chunk (state=3): >>><<< 11044 1726853253.06077: done transferring module to remote 11044 1726853253.06086: _low_level_execute_command(): starting 11044 1726853253.06092: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853252.9941638-11948-69517600501083/ /root/.ansible/tmp/ansible-tmp-1726853252.9941638-11948-69517600501083/AnsiballZ_stat.py && sleep 0' 11044 1726853253.06600: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found <<< 11044 1726853253.06604: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests 
final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11044 1726853253.06655: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 11044 1726853253.06660: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11044 1726853253.06703: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11044 1726853253.08476: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11044 1726853253.08499: stderr chunk (state=3): >>><<< 11044 1726853253.08503: stdout chunk (state=3): >>><<< 11044 1726853253.08522: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11044 1726853253.08526: _low_level_execute_command(): starting 11044 1726853253.08528: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853252.9941638-11948-69517600501083/AnsiballZ_stat.py && sleep 0' 11044 1726853253.08931: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11044 1726853253.08966: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11044 1726853253.08970: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found <<< 11044 1726853253.08976: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11044 1726853253.08979: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11044 1726853253.09020: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK <<< 11044 1726853253.09024: stderr chunk (state=3): >>>debug2: 
mux_client_hello_exchange: master version 4 <<< 11044 1726853253.09080: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11044 1726853253.24463: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": true, "path": "/sys/class/net/deprecated-bond", "mode": "0777", "isdir": false, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": true, "issock": false, "uid": 0, "gid": 0, "size": 0, "inode": 27271, "dev": 23, "nlink": 1, "atime": 1726853251.8659668, "mtime": 1726853251.8659668, "ctime": 1726853251.8659668, "wusr": true, "rusr": true, "xusr": true, "wgrp": true, "rgrp": true, "xgrp": true, "woth": true, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "lnk_source": "/sys/devices/virtual/net/deprecated-bond", "lnk_target": "../../devices/virtual/net/deprecated-bond", "pw_name": "root", "gr_name": "root"}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/deprecated-bond", "follow": false, "checksum_algorithm": "sha1"}}} <<< 11044 1726853253.25792: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.45.153 closed. 
<<< 11044 1726853253.25819: stderr chunk (state=3): >>><<< 11044 1726853253.25823: stdout chunk (state=3): >>><<< 11044 1726853253.25840: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": true, "path": "/sys/class/net/deprecated-bond", "mode": "0777", "isdir": false, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": true, "issock": false, "uid": 0, "gid": 0, "size": 0, "inode": 27271, "dev": 23, "nlink": 1, "atime": 1726853251.8659668, "mtime": 1726853251.8659668, "ctime": 1726853251.8659668, "wusr": true, "rusr": true, "xusr": true, "wgrp": true, "rgrp": true, "xgrp": true, "woth": true, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "lnk_source": "/sys/devices/virtual/net/deprecated-bond", "lnk_target": "../../devices/virtual/net/deprecated-bond", "pw_name": "root", "gr_name": "root"}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/deprecated-bond", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: 
match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.45.153 closed. 11044 1726853253.25884: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/sys/class/net/deprecated-bond', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853252.9941638-11948-69517600501083/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11044 1726853253.25891: _low_level_execute_command(): starting 11044 1726853253.25900: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853252.9941638-11948-69517600501083/ > /dev/null 2>&1 && sleep 0' 11044 1726853253.26350: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11044 1726853253.26354: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found <<< 11044 1726853253.26356: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 11044 1726853253.26358: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11044 1726853253.26361: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11044 1726853253.26417: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 11044 1726853253.26422: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11044 1726853253.26425: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11044 1726853253.26464: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11044 1726853253.28306: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11044 1726853253.28333: stderr chunk (state=3): >>><<< 11044 1726853253.28336: stdout chunk (state=3): >>><<< 11044 1726853253.28355: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config 
debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11044 1726853253.28362: handler run complete 11044 1726853253.28395: attempt loop complete, returning result 11044 1726853253.28398: _execute() done 11044 1726853253.28401: dumping result to json 11044 1726853253.28405: done dumping result, returning 11044 1726853253.28412: done running TaskExecutor() for managed_node1/TASK: Get stat for interface deprecated-bond [02083763-bbaf-c5a6-f857-000000000242] 11044 1726853253.28419: sending task result for task 02083763-bbaf-c5a6-f857-000000000242 11044 1726853253.28526: done sending task result for task 02083763-bbaf-c5a6-f857-000000000242 11044 1726853253.28529: WORKER PROCESS EXITING ok: [managed_node1] => { "changed": false, "stat": { "atime": 1726853251.8659668, "block_size": 4096, "blocks": 0, "ctime": 1726853251.8659668, "dev": 23, "device_type": 0, "executable": true, "exists": true, "gid": 0, "gr_name": "root", "inode": 27271, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": true, "isreg": false, "issock": false, "isuid": false, "lnk_source": "/sys/devices/virtual/net/deprecated-bond", "lnk_target": "../../devices/virtual/net/deprecated-bond", "mode": "0777", "mtime": 1726853251.8659668, "nlink": 1, "path": "/sys/class/net/deprecated-bond", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 0, "uid": 0, "wgrp": true, "woth": true, "writeable": true, "wusr": true, "xgrp": true, 
"xoth": true, "xusr": true } } 11044 1726853253.28613: no more pending results, returning what we have 11044 1726853253.28616: results queue empty 11044 1726853253.28617: checking for any_errors_fatal 11044 1726853253.28619: done checking for any_errors_fatal 11044 1726853253.28619: checking for max_fail_percentage 11044 1726853253.28622: done checking for max_fail_percentage 11044 1726853253.28623: checking to see if all hosts have failed and the running result is not ok 11044 1726853253.28623: done checking to see if all hosts have failed 11044 1726853253.28624: getting the remaining hosts for this loop 11044 1726853253.28625: done getting the remaining hosts for this loop 11044 1726853253.28629: getting the next task for host managed_node1 11044 1726853253.28636: done getting next task for host managed_node1 11044 1726853253.28640: ^ task is: TASK: Assert that the interface is present - '{{ interface }}' 11044 1726853253.28643: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11044 1726853253.28647: getting variables 11044 1726853253.28649: in VariableManager get_vars() 11044 1726853253.28691: Calling all_inventory to load vars for managed_node1 11044 1726853253.28694: Calling groups_inventory to load vars for managed_node1 11044 1726853253.28696: Calling all_plugins_inventory to load vars for managed_node1 11044 1726853253.28706: Calling all_plugins_play to load vars for managed_node1 11044 1726853253.28708: Calling groups_plugins_inventory to load vars for managed_node1 11044 1726853253.28710: Calling groups_plugins_play to load vars for managed_node1 11044 1726853253.29942: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11044 1726853253.30834: done with get_vars() 11044 1726853253.30853: done getting variables 11044 1726853253.30898: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 11044 1726853253.30986: variable 'interface' from source: task vars 11044 1726853253.30989: variable 'controller_device' from source: play vars 11044 1726853253.31032: variable 'controller_device' from source: play vars TASK [Assert that the interface is present - 'deprecated-bond'] **************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml:5 Friday 20 September 2024 13:27:33 -0400 (0:00:00.375) 0:00:17.686 ****** 11044 1726853253.31061: entering _queue_task() for managed_node1/assert 11044 1726853253.31309: worker is 1 (out of 1 available) 11044 1726853253.31323: exiting _queue_task() for managed_node1/assert 11044 1726853253.31337: done queuing things up, now waiting for results queue to drain 11044 1726853253.31339: waiting for pending results... 
11044 1726853253.31530: running TaskExecutor() for managed_node1/TASK: Assert that the interface is present - 'deprecated-bond' 11044 1726853253.31609: in run() - task 02083763-bbaf-c5a6-f857-00000000006f 11044 1726853253.31619: variable 'ansible_search_path' from source: unknown 11044 1726853253.31622: variable 'ansible_search_path' from source: unknown 11044 1726853253.31655: calling self._execute() 11044 1726853253.31884: variable 'ansible_host' from source: host vars for 'managed_node1' 11044 1726853253.31887: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11044 1726853253.31891: variable 'omit' from source: magic vars 11044 1726853253.32052: variable 'ansible_distribution_major_version' from source: facts 11044 1726853253.32061: Evaluated conditional (ansible_distribution_major_version != '6'): True 11044 1726853253.32067: variable 'omit' from source: magic vars 11044 1726853253.32103: variable 'omit' from source: magic vars 11044 1726853253.32172: variable 'interface' from source: task vars 11044 1726853253.32176: variable 'controller_device' from source: play vars 11044 1726853253.32222: variable 'controller_device' from source: play vars 11044 1726853253.32237: variable 'omit' from source: magic vars 11044 1726853253.32275: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11044 1726853253.32301: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11044 1726853253.32320: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11044 1726853253.32334: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11044 1726853253.32351: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11044 1726853253.32376: variable 'inventory_hostname' from 
source: host vars for 'managed_node1' 11044 1726853253.32380: variable 'ansible_host' from source: host vars for 'managed_node1' 11044 1726853253.32382: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11044 1726853253.32454: Set connection var ansible_timeout to 10 11044 1726853253.32462: Set connection var ansible_shell_executable to /bin/sh 11044 1726853253.32465: Set connection var ansible_shell_type to sh 11044 1726853253.32469: Set connection var ansible_module_compression to ZIP_DEFLATED 11044 1726853253.32476: Set connection var ansible_connection to ssh 11044 1726853253.32481: Set connection var ansible_pipelining to False 11044 1726853253.32499: variable 'ansible_shell_executable' from source: unknown 11044 1726853253.32502: variable 'ansible_connection' from source: unknown 11044 1726853253.32505: variable 'ansible_module_compression' from source: unknown 11044 1726853253.32507: variable 'ansible_shell_type' from source: unknown 11044 1726853253.32510: variable 'ansible_shell_executable' from source: unknown 11044 1726853253.32512: variable 'ansible_host' from source: host vars for 'managed_node1' 11044 1726853253.32515: variable 'ansible_pipelining' from source: unknown 11044 1726853253.32517: variable 'ansible_timeout' from source: unknown 11044 1726853253.32520: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11044 1726853253.32623: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11044 1726853253.32631: variable 'omit' from source: magic vars 11044 1726853253.32636: starting attempt loop 11044 1726853253.32639: running the handler 11044 1726853253.32881: variable 'interface_stat' from source: set_fact 11044 1726853253.32885: Evaluated 
conditional (interface_stat.stat.exists): True 11044 1726853253.32887: handler run complete 11044 1726853253.32889: attempt loop complete, returning result 11044 1726853253.32890: _execute() done 11044 1726853253.32892: dumping result to json 11044 1726853253.32894: done dumping result, returning 11044 1726853253.32895: done running TaskExecutor() for managed_node1/TASK: Assert that the interface is present - 'deprecated-bond' [02083763-bbaf-c5a6-f857-00000000006f] 11044 1726853253.32897: sending task result for task 02083763-bbaf-c5a6-f857-00000000006f 11044 1726853253.32953: done sending task result for task 02083763-bbaf-c5a6-f857-00000000006f 11044 1726853253.32955: WORKER PROCESS EXITING ok: [managed_node1] => { "changed": false } MSG: All assertions passed 11044 1726853253.33029: no more pending results, returning what we have 11044 1726853253.33031: results queue empty 11044 1726853253.33032: checking for any_errors_fatal 11044 1726853253.33040: done checking for any_errors_fatal 11044 1726853253.33041: checking for max_fail_percentage 11044 1726853253.33042: done checking for max_fail_percentage 11044 1726853253.33043: checking to see if all hosts have failed and the running result is not ok 11044 1726853253.33044: done checking to see if all hosts have failed 11044 1726853253.33045: getting the remaining hosts for this loop 11044 1726853253.33046: done getting the remaining hosts for this loop 11044 1726853253.33049: getting the next task for host managed_node1 11044 1726853253.33055: done getting next task for host managed_node1 11044 1726853253.33057: ^ task is: TASK: Include the task 'assert_profile_present.yml' 11044 1726853253.33059: ^ state is: HOST STATE: block=2, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11044 1726853253.33062: getting variables 11044 1726853253.33063: in VariableManager get_vars() 11044 1726853253.33176: Calling all_inventory to load vars for managed_node1 11044 1726853253.33179: Calling groups_inventory to load vars for managed_node1 11044 1726853253.33183: Calling all_plugins_inventory to load vars for managed_node1 11044 1726853253.33194: Calling all_plugins_play to load vars for managed_node1 11044 1726853253.33197: Calling groups_plugins_inventory to load vars for managed_node1 11044 1726853253.33200: Calling groups_plugins_play to load vars for managed_node1 11044 1726853253.34063: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11044 1726853253.35007: done with get_vars() 11044 1726853253.35024: done getting variables TASK [Include the task 'assert_profile_present.yml'] *************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond_deprecated.yml:67 Friday 20 September 2024 13:27:33 -0400 (0:00:00.040) 0:00:17.726 ****** 11044 1726853253.35098: entering _queue_task() for managed_node1/include_tasks 11044 1726853253.35361: worker is 1 (out of 1 available) 11044 1726853253.35377: exiting _queue_task() for managed_node1/include_tasks 11044 1726853253.35390: done queuing things up, now waiting for results queue to drain 11044 1726853253.35391: waiting for pending results... 
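The assert task that just completed above evaluated the conditional `interface_stat.stat.exists` and returned the assert module's default success message ("All assertions passed"). The task body itself is not captured in this log, but a minimal sketch of what it likely looks like, reconstructed only from the task name and the evaluated conditional, would be:

```yaml
# Hypothetical reconstruction from the log output; the real task in
# tests_bond_deprecated.yml may differ in wording and variable names.
- name: "Assert that the interface is present - '{{ interface }}'"
  assert:
    that:
      - interface_stat.stat.exists   # the conditional the log shows evaluating to True
```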
11044 1726853253.35580: running TaskExecutor() for managed_node1/TASK: Include the task 'assert_profile_present.yml' 11044 1726853253.35639: in run() - task 02083763-bbaf-c5a6-f857-000000000070 11044 1726853253.35652: variable 'ansible_search_path' from source: unknown 11044 1726853253.35691: variable 'controller_profile' from source: play vars 11044 1726853253.35839: variable 'controller_profile' from source: play vars 11044 1726853253.35852: variable 'port1_profile' from source: play vars 11044 1726853253.35902: variable 'port1_profile' from source: play vars 11044 1726853253.35908: variable 'port2_profile' from source: play vars 11044 1726853253.35957: variable 'port2_profile' from source: play vars 11044 1726853253.35967: variable 'omit' from source: magic vars 11044 1726853253.36067: variable 'ansible_host' from source: host vars for 'managed_node1' 11044 1726853253.36078: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11044 1726853253.36088: variable 'omit' from source: magic vars 11044 1726853253.36254: variable 'ansible_distribution_major_version' from source: facts 11044 1726853253.36262: Evaluated conditional (ansible_distribution_major_version != '6'): True 11044 1726853253.36287: variable 'item' from source: unknown 11044 1726853253.36328: variable 'item' from source: unknown 11044 1726853253.36442: variable 'ansible_host' from source: host vars for 'managed_node1' 11044 1726853253.36446: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11044 1726853253.36448: variable 'omit' from source: magic vars 11044 1726853253.36532: variable 'ansible_distribution_major_version' from source: facts 11044 1726853253.36535: Evaluated conditional (ansible_distribution_major_version != '6'): True 11044 1726853253.36555: variable 'item' from source: unknown 11044 1726853253.36601: variable 'item' from source: unknown 11044 1726853253.36663: variable 'ansible_host' from source: host vars for 'managed_node1' 11044 
1726853253.36675: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11044 1726853253.36683: variable 'omit' from source: magic vars 11044 1726853253.36780: variable 'ansible_distribution_major_version' from source: facts 11044 1726853253.36785: Evaluated conditional (ansible_distribution_major_version != '6'): True 11044 1726853253.36805: variable 'item' from source: unknown 11044 1726853253.36846: variable 'item' from source: unknown 11044 1726853253.36907: dumping result to json 11044 1726853253.36910: done dumping result, returning 11044 1726853253.36913: done running TaskExecutor() for managed_node1/TASK: Include the task 'assert_profile_present.yml' [02083763-bbaf-c5a6-f857-000000000070] 11044 1726853253.36915: sending task result for task 02083763-bbaf-c5a6-f857-000000000070 11044 1726853253.36947: done sending task result for task 02083763-bbaf-c5a6-f857-000000000070 11044 1726853253.36950: WORKER PROCESS EXITING 11044 1726853253.36979: no more pending results, returning what we have 11044 1726853253.36983: in VariableManager get_vars() 11044 1726853253.37025: Calling all_inventory to load vars for managed_node1 11044 1726853253.37029: Calling groups_inventory to load vars for managed_node1 11044 1726853253.37031: Calling all_plugins_inventory to load vars for managed_node1 11044 1726853253.37044: Calling all_plugins_play to load vars for managed_node1 11044 1726853253.37047: Calling groups_plugins_inventory to load vars for managed_node1 11044 1726853253.37049: Calling groups_plugins_play to load vars for managed_node1 11044 1726853253.37851: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11044 1726853253.38710: done with get_vars() 11044 1726853253.38726: variable 'ansible_search_path' from source: unknown 11044 1726853253.38738: variable 'ansible_search_path' from source: unknown 11044 1726853253.38744: variable 'ansible_search_path' from source: unknown 11044 
1726853253.38749: we have included files to process 11044 1726853253.38750: generating all_blocks data 11044 1726853253.38751: done generating all_blocks data 11044 1726853253.38754: processing included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml 11044 1726853253.38755: loading included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml 11044 1726853253.38756: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml 11044 1726853253.38883: in VariableManager get_vars() 11044 1726853253.38898: done with get_vars() 11044 1726853253.39068: done processing included file 11044 1726853253.39070: iterating over new_blocks loaded from include file 11044 1726853253.39072: in VariableManager get_vars() 11044 1726853253.39084: done with get_vars() 11044 1726853253.39085: filtering new block on tags 11044 1726853253.39099: done filtering new block on tags 11044 1726853253.39100: done iterating over new_blocks loaded from include file included: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml for managed_node1 => (item=bond0) 11044 1726853253.39104: processing included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml 11044 1726853253.39104: loading included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml 11044 1726853253.39106: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml 11044 1726853253.39164: in VariableManager get_vars() 11044 1726853253.39180: done with get_vars() 11044 1726853253.39321: done 
processing included file 11044 1726853253.39322: iterating over new_blocks loaded from include file 11044 1726853253.39323: in VariableManager get_vars() 11044 1726853253.39333: done with get_vars() 11044 1726853253.39334: filtering new block on tags 11044 1726853253.39345: done filtering new block on tags 11044 1726853253.39346: done iterating over new_blocks loaded from include file included: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml for managed_node1 => (item=bond0.0) 11044 1726853253.39349: processing included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml 11044 1726853253.39349: loading included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml 11044 1726853253.39351: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml 11044 1726853253.39411: in VariableManager get_vars() 11044 1726853253.39457: done with get_vars() 11044 1726853253.39602: done processing included file 11044 1726853253.39604: iterating over new_blocks loaded from include file 11044 1726853253.39604: in VariableManager get_vars() 11044 1726853253.39615: done with get_vars() 11044 1726853253.39616: filtering new block on tags 11044 1726853253.39627: done filtering new block on tags 11044 1726853253.39628: done iterating over new_blocks loaded from include file included: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml for managed_node1 => (item=bond0.1) 11044 1726853253.39630: extending task lists for all hosts with included blocks 11044 1726853253.41933: done extending task lists 11044 1726853253.41938: done processing included files 11044 1726853253.41939: results queue empty 11044 
1726853253.41939: checking for any_errors_fatal 11044 1726853253.41942: done checking for any_errors_fatal 11044 1726853253.41942: checking for max_fail_percentage 11044 1726853253.41943: done checking for max_fail_percentage 11044 1726853253.41945: checking to see if all hosts have failed and the running result is not ok 11044 1726853253.41946: done checking to see if all hosts have failed 11044 1726853253.41946: getting the remaining hosts for this loop 11044 1726853253.41947: done getting the remaining hosts for this loop 11044 1726853253.41949: getting the next task for host managed_node1 11044 1726853253.41951: done getting next task for host managed_node1 11044 1726853253.41953: ^ task is: TASK: Include the task 'get_profile_stat.yml' 11044 1726853253.41955: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11044 1726853253.41956: getting variables 11044 1726853253.41957: in VariableManager get_vars() 11044 1726853253.41966: Calling all_inventory to load vars for managed_node1 11044 1726853253.41968: Calling groups_inventory to load vars for managed_node1 11044 1726853253.41969: Calling all_plugins_inventory to load vars for managed_node1 11044 1726853253.41975: Calling all_plugins_play to load vars for managed_node1 11044 1726853253.41977: Calling groups_plugins_inventory to load vars for managed_node1 11044 1726853253.41978: Calling groups_plugins_play to load vars for managed_node1 11044 1726853253.45725: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11044 1726853253.46564: done with get_vars() 11044 1726853253.46583: done getting variables TASK [Include the task 'get_profile_stat.yml'] ********************************* task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:3 Friday 20 September 2024 13:27:33 -0400 (0:00:00.115) 0:00:17.842 ****** 11044 1726853253.46635: entering _queue_task() for managed_node1/include_tasks 11044 1726853253.46909: worker is 1 (out of 1 available) 11044 1726853253.46922: exiting _queue_task() for managed_node1/include_tasks 11044 1726853253.46936: done queuing things up, now waiting for results queue to drain 11044 1726853253.46938: waiting for pending results... 
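The include above fanned out to three items (bond0, bond0.0, bond0.1), drawn from the play vars `controller_profile`, `port1_profile`, and `port2_profile`, with the `ansible_distribution_major_version != '6'` conditional evaluated once per item. The actual YAML at tests_bond_deprecated.yml:67 is not in this log, but a hedged sketch consistent with the variables and items shown would be:

```yaml
# Illustrative shape only; inferred from the loop variables and resolved
# items visible in the log, not copied from the real playbook.
- name: Include the task 'assert_profile_present.yml'
  include_tasks: tasks/assert_profile_present.yml
  when: ansible_distribution_major_version != '6'
  loop:
    - "{{ controller_profile }}"   # resolved to bond0
    - "{{ port1_profile }}"        # resolved to bond0.0
    - "{{ port2_profile }}"        # resolved to bond0.1
```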
11044 1726853253.47116: running TaskExecutor() for managed_node1/TASK: Include the task 'get_profile_stat.yml' 11044 1726853253.47192: in run() - task 02083763-bbaf-c5a6-f857-000000000260 11044 1726853253.47203: variable 'ansible_search_path' from source: unknown 11044 1726853253.47206: variable 'ansible_search_path' from source: unknown 11044 1726853253.47233: calling self._execute() 11044 1726853253.47307: variable 'ansible_host' from source: host vars for 'managed_node1' 11044 1726853253.47312: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11044 1726853253.47321: variable 'omit' from source: magic vars 11044 1726853253.47603: variable 'ansible_distribution_major_version' from source: facts 11044 1726853253.47613: Evaluated conditional (ansible_distribution_major_version != '6'): True 11044 1726853253.47619: _execute() done 11044 1726853253.47622: dumping result to json 11044 1726853253.47626: done dumping result, returning 11044 1726853253.47632: done running TaskExecutor() for managed_node1/TASK: Include the task 'get_profile_stat.yml' [02083763-bbaf-c5a6-f857-000000000260] 11044 1726853253.47635: sending task result for task 02083763-bbaf-c5a6-f857-000000000260 11044 1726853253.47729: done sending task result for task 02083763-bbaf-c5a6-f857-000000000260 11044 1726853253.47732: WORKER PROCESS EXITING 11044 1726853253.47761: no more pending results, returning what we have 11044 1726853253.47765: in VariableManager get_vars() 11044 1726853253.47814: Calling all_inventory to load vars for managed_node1 11044 1726853253.47817: Calling groups_inventory to load vars for managed_node1 11044 1726853253.47819: Calling all_plugins_inventory to load vars for managed_node1 11044 1726853253.47832: Calling all_plugins_play to load vars for managed_node1 11044 1726853253.47835: Calling groups_plugins_inventory to load vars for managed_node1 11044 1726853253.47837: Calling groups_plugins_play to load vars for managed_node1 11044 
1726853253.48619: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11044 1726853253.49788: done with get_vars() 11044 1726853253.49806: variable 'ansible_search_path' from source: unknown 11044 1726853253.49807: variable 'ansible_search_path' from source: unknown 11044 1726853253.49843: we have included files to process 11044 1726853253.49847: generating all_blocks data 11044 1726853253.49849: done generating all_blocks data 11044 1726853253.49851: processing included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 11044 1726853253.49852: loading included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 11044 1726853253.49854: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 11044 1726853253.50755: done processing included file 11044 1726853253.50757: iterating over new_blocks loaded from include file 11044 1726853253.50758: in VariableManager get_vars() 11044 1726853253.50773: done with get_vars() 11044 1726853253.50774: filtering new block on tags 11044 1726853253.50789: done filtering new block on tags 11044 1726853253.50791: in VariableManager get_vars() 11044 1726853253.50802: done with get_vars() 11044 1726853253.50803: filtering new block on tags 11044 1726853253.50817: done filtering new block on tags 11044 1726853253.50818: done iterating over new_blocks loaded from include file included: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml for managed_node1 11044 1726853253.50823: extending task lists for all hosts with included blocks 11044 1726853253.50966: done extending task lists 11044 1726853253.50967: done processing included files 11044 1726853253.50968: results queue empty 11044 
1726853253.50968: checking for any_errors_fatal 11044 1726853253.50972: done checking for any_errors_fatal 11044 1726853253.50973: checking for max_fail_percentage 11044 1726853253.50974: done checking for max_fail_percentage 11044 1726853253.50974: checking to see if all hosts have failed and the running result is not ok 11044 1726853253.50975: done checking to see if all hosts have failed 11044 1726853253.50975: getting the remaining hosts for this loop 11044 1726853253.50976: done getting the remaining hosts for this loop 11044 1726853253.50977: getting the next task for host managed_node1 11044 1726853253.50980: done getting next task for host managed_node1 11044 1726853253.50981: ^ task is: TASK: Initialize NM profile exist and ansible_managed comment flag 11044 1726853253.50983: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11044 1726853253.50985: getting variables 11044 1726853253.50985: in VariableManager get_vars() 11044 1726853253.50993: Calling all_inventory to load vars for managed_node1 11044 1726853253.50995: Calling groups_inventory to load vars for managed_node1 11044 1726853253.50996: Calling all_plugins_inventory to load vars for managed_node1 11044 1726853253.51000: Calling all_plugins_play to load vars for managed_node1 11044 1726853253.51002: Calling groups_plugins_inventory to load vars for managed_node1 11044 1726853253.51004: Calling groups_plugins_play to load vars for managed_node1 11044 1726853253.51616: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11044 1726853253.52653: done with get_vars() 11044 1726853253.52677: done getting variables 11044 1726853253.52719: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Initialize NM profile exist and ansible_managed comment flag] ************ task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:3 Friday 20 September 2024 13:27:33 -0400 (0:00:00.061) 0:00:17.903 ****** 11044 1726853253.52754: entering _queue_task() for managed_node1/set_fact 11044 1726853253.53102: worker is 1 (out of 1 available) 11044 1726853253.53114: exiting _queue_task() for managed_node1/set_fact 11044 1726853253.53126: done queuing things up, now waiting for results queue to drain 11044 1726853253.53127: waiting for pending results... 
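Per the task path above, assert_profile_present.yml begins at line 3 by pulling in get_profile_stat.yml; the log later resolves a `profile` variable "from source: include params". A plausible sketch of that first task, under the assumption that the current loop item is forwarded as `profile`:

```yaml
# Hypothetical; the real assert_profile_present.yml may pass variables
# differently (e.g. profile could be set by the outer include instead).
- name: Include the task 'get_profile_stat.yml'
  include_tasks: get_profile_stat.yml
  vars:
    profile: "{{ item }}"
```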
11044 1726853253.53496: running TaskExecutor() for managed_node1/TASK: Initialize NM profile exist and ansible_managed comment flag 11044 1726853253.53624: in run() - task 02083763-bbaf-c5a6-f857-0000000003b3 11044 1726853253.53651: variable 'ansible_search_path' from source: unknown 11044 1726853253.53659: variable 'ansible_search_path' from source: unknown 11044 1726853253.53742: calling self._execute() 11044 1726853253.53851: variable 'ansible_host' from source: host vars for 'managed_node1' 11044 1726853253.53862: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11044 1726853253.53876: variable 'omit' from source: magic vars 11044 1726853253.54303: variable 'ansible_distribution_major_version' from source: facts 11044 1726853253.54323: Evaluated conditional (ansible_distribution_major_version != '6'): True 11044 1726853253.54450: variable 'omit' from source: magic vars 11044 1726853253.54454: variable 'omit' from source: magic vars 11044 1726853253.54457: variable 'omit' from source: magic vars 11044 1726853253.54487: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11044 1726853253.54531: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11044 1726853253.54568: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11044 1726853253.54593: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11044 1726853253.54610: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11044 1726853253.54650: variable 'inventory_hostname' from source: host vars for 'managed_node1' 11044 1726853253.54659: variable 'ansible_host' from source: host vars for 'managed_node1' 11044 1726853253.54675: variable 'ansible_ssh_extra_args' from source: host vars for 
'managed_node1' 11044 1726853253.54802: Set connection var ansible_timeout to 10 11044 1726853253.54818: Set connection var ansible_shell_executable to /bin/sh 11044 1726853253.54826: Set connection var ansible_shell_type to sh 11044 1726853253.54836: Set connection var ansible_module_compression to ZIP_DEFLATED 11044 1726853253.54849: Set connection var ansible_connection to ssh 11044 1726853253.54861: Set connection var ansible_pipelining to False 11044 1726853253.54898: variable 'ansible_shell_executable' from source: unknown 11044 1726853253.54906: variable 'ansible_connection' from source: unknown 11044 1726853253.54914: variable 'ansible_module_compression' from source: unknown 11044 1726853253.54921: variable 'ansible_shell_type' from source: unknown 11044 1726853253.54994: variable 'ansible_shell_executable' from source: unknown 11044 1726853253.54997: variable 'ansible_host' from source: host vars for 'managed_node1' 11044 1726853253.54999: variable 'ansible_pipelining' from source: unknown 11044 1726853253.55002: variable 'ansible_timeout' from source: unknown 11044 1726853253.55004: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11044 1726853253.55116: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11044 1726853253.55133: variable 'omit' from source: magic vars 11044 1726853253.55143: starting attempt loop 11044 1726853253.55154: running the handler 11044 1726853253.55178: handler run complete 11044 1726853253.55193: attempt loop complete, returning result 11044 1726853253.55201: _execute() done 11044 1726853253.55212: dumping result to json 11044 1726853253.55319: done dumping result, returning 11044 1726853253.55322: done running TaskExecutor() for 
managed_node1/TASK: Initialize NM profile exist and ansible_managed comment flag [02083763-bbaf-c5a6-f857-0000000003b3] 11044 1726853253.55326: sending task result for task 02083763-bbaf-c5a6-f857-0000000003b3 11044 1726853253.55395: done sending task result for task 02083763-bbaf-c5a6-f857-0000000003b3 11044 1726853253.55398: WORKER PROCESS EXITING ok: [managed_node1] => { "ansible_facts": { "lsr_net_profile_ansible_managed": false, "lsr_net_profile_exists": false, "lsr_net_profile_fingerprint": false }, "changed": false } 11044 1726853253.55485: no more pending results, returning what we have 11044 1726853253.55489: results queue empty 11044 1726853253.55490: checking for any_errors_fatal 11044 1726853253.55492: done checking for any_errors_fatal 11044 1726853253.55493: checking for max_fail_percentage 11044 1726853253.55495: done checking for max_fail_percentage 11044 1726853253.55496: checking to see if all hosts have failed and the running result is not ok 11044 1726853253.55496: done checking to see if all hosts have failed 11044 1726853253.55497: getting the remaining hosts for this loop 11044 1726853253.55498: done getting the remaining hosts for this loop 11044 1726853253.55502: getting the next task for host managed_node1 11044 1726853253.55509: done getting next task for host managed_node1 11044 1726853253.55511: ^ task is: TASK: Stat profile file 11044 1726853253.55516: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11044 1726853253.55520: getting variables 11044 1726853253.55521: in VariableManager get_vars() 11044 1726853253.55577: Calling all_inventory to load vars for managed_node1 11044 1726853253.55581: Calling groups_inventory to load vars for managed_node1 11044 1726853253.55584: Calling all_plugins_inventory to load vars for managed_node1 11044 1726853253.55596: Calling all_plugins_play to load vars for managed_node1 11044 1726853253.55600: Calling groups_plugins_inventory to load vars for managed_node1 11044 1726853253.55603: Calling groups_plugins_play to load vars for managed_node1 11044 1726853253.57598: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11044 1726853253.59510: done with get_vars() 11044 1726853253.59531: done getting variables TASK [Stat profile file] ******************************************************* task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:9 Friday 20 September 2024 13:27:33 -0400 (0:00:00.068) 0:00:17.971 ****** 11044 1726853253.59603: entering _queue_task() for managed_node1/stat 11044 1726853253.59851: worker is 1 (out of 1 available) 11044 1726853253.59863: exiting _queue_task() for managed_node1/stat 11044 1726853253.59877: done queuing things up, now waiting for results queue to drain 11044 1726853253.59879: waiting for pending results... 
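The `ok:` result above lists exactly which facts the first task of get_profile_stat.yml initializes, so that task can be reconstructed with reasonable confidence (wording aside):

```yaml
# Reconstructed from the ansible_facts shown in the task result at
# get_profile_stat.yml:3; formatting is illustrative.
- name: Initialize NM profile exist and ansible_managed comment flag
  set_fact:
    lsr_net_profile_exists: false
    lsr_net_profile_ansible_managed: false
    lsr_net_profile_fingerprint: false
```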
11044 1726853253.60112: running TaskExecutor() for managed_node1/TASK: Stat profile file 11044 1726853253.60205: in run() - task 02083763-bbaf-c5a6-f857-0000000003b4 11044 1726853253.60212: variable 'ansible_search_path' from source: unknown 11044 1726853253.60216: variable 'ansible_search_path' from source: unknown 11044 1726853253.60279: calling self._execute() 11044 1726853253.60353: variable 'ansible_host' from source: host vars for 'managed_node1' 11044 1726853253.60478: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11044 1726853253.60482: variable 'omit' from source: magic vars 11044 1726853253.60829: variable 'ansible_distribution_major_version' from source: facts 11044 1726853253.60834: Evaluated conditional (ansible_distribution_major_version != '6'): True 11044 1726853253.60837: variable 'omit' from source: magic vars 11044 1726853253.60840: variable 'omit' from source: magic vars 11044 1726853253.61170: variable 'profile' from source: include params 11044 1726853253.61176: variable 'item' from source: include params 11044 1726853253.61238: variable 'item' from source: include params 11044 1726853253.61259: variable 'omit' from source: magic vars 11044 1726853253.61303: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11044 1726853253.61338: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11044 1726853253.61378: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11044 1726853253.61497: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11044 1726853253.61576: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11044 1726853253.61580: variable 'inventory_hostname' from source: host vars for 'managed_node1' 11044 
1726853253.61582: variable 'ansible_host' from source: host vars for 'managed_node1' 11044 1726853253.61584: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11044 1726853253.61874: Set connection var ansible_timeout to 10 11044 1726853253.61878: Set connection var ansible_shell_executable to /bin/sh 11044 1726853253.61881: Set connection var ansible_shell_type to sh 11044 1726853253.61883: Set connection var ansible_module_compression to ZIP_DEFLATED 11044 1726853253.61886: Set connection var ansible_connection to ssh 11044 1726853253.61888: Set connection var ansible_pipelining to False 11044 1726853253.61891: variable 'ansible_shell_executable' from source: unknown 11044 1726853253.61893: variable 'ansible_connection' from source: unknown 11044 1726853253.61896: variable 'ansible_module_compression' from source: unknown 11044 1726853253.61898: variable 'ansible_shell_type' from source: unknown 11044 1726853253.61900: variable 'ansible_shell_executable' from source: unknown 11044 1726853253.61902: variable 'ansible_host' from source: host vars for 'managed_node1' 11044 1726853253.61903: variable 'ansible_pipelining' from source: unknown 11044 1726853253.61906: variable 'ansible_timeout' from source: unknown 11044 1726853253.61907: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11044 1726853253.62276: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 11044 1726853253.62280: variable 'omit' from source: magic vars 11044 1726853253.62283: starting attempt loop 11044 1726853253.62286: running the handler 11044 1726853253.62288: _low_level_execute_command(): starting 11044 1726853253.62290: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11044 1726853253.63092: stderr chunk 
(state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 11044 1726853253.63110: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11044 1726853253.63126: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11044 1726853253.63205: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11044 1726853253.65305: stdout chunk (state=3): >>>/root <<< 11044 1726853253.65401: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11044 1726853253.65406: stdout chunk (state=3): >>><<< 11044 1726853253.65414: stderr chunk (state=3): >>><<< 11044 1726853253.65446: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11044 1726853253.65465: _low_level_execute_command(): starting 11044 1726853253.65475: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853253.6545012-11980-235046469936267 `" && echo ansible-tmp-1726853253.6545012-11980-235046469936267="` echo /root/.ansible/tmp/ansible-tmp-1726853253.6545012-11980-235046469936267 `" ) && sleep 0' 11044 1726853253.66678: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11044 1726853253.66750: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: 
Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11044 1726853253.66827: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 11044 1726853253.66946: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11044 1726853253.67287: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11044 1726853253.69058: stdout chunk (state=3): >>>ansible-tmp-1726853253.6545012-11980-235046469936267=/root/.ansible/tmp/ansible-tmp-1726853253.6545012-11980-235046469936267 <<< 11044 1726853253.69342: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11044 1726853253.69351: stderr chunk (state=3): >>><<< 11044 1726853253.69354: stdout chunk (state=3): >>><<< 11044 1726853253.69385: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853253.6545012-11980-235046469936267=/root/.ansible/tmp/ansible-tmp-1726853253.6545012-11980-235046469936267 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11044 1726853253.69461: variable 'ansible_module_compression' from source: unknown 11044 1726853253.69519: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-1104467doc9gy/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 11044 1726853253.69997: variable 'ansible_facts' from source: unknown 11044 1726853253.70079: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853253.6545012-11980-235046469936267/AnsiballZ_stat.py 11044 1726853253.70396: Sending initial data 11044 1726853253.70399: Sent initial data (153 bytes) 11044 1726853253.71553: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11044 1726853253.71890: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 11044 1726853253.71908: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11044 1726853253.71926: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11044 1726853253.72000: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11044 1726853253.73696: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 <<< 11044 1726853253.73714: stderr chunk (state=3): >>>debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 11044 1726853253.73868: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 11044 1726853253.74002: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-1104467doc9gy/tmpvaik1v11 /root/.ansible/tmp/ansible-tmp-1726853253.6545012-11980-235046469936267/AnsiballZ_stat.py <<< 11044 1726853253.74069: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853253.6545012-11980-235046469936267/AnsiballZ_stat.py" <<< 11044 1726853253.74103: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-1104467doc9gy/tmpvaik1v11" to remote "/root/.ansible/tmp/ansible-tmp-1726853253.6545012-11980-235046469936267/AnsiballZ_stat.py" <<< 11044 1726853253.74116: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853253.6545012-11980-235046469936267/AnsiballZ_stat.py" <<< 11044 1726853253.75367: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11044 1726853253.75404: stderr chunk (state=3): >>><<< 11044 1726853253.75407: stdout chunk (state=3): >>><<< 11044 1726853253.75782: done transferring module to remote 11044 1726853253.75785: _low_level_execute_command(): starting 11044 1726853253.75787: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853253.6545012-11980-235046469936267/ /root/.ansible/tmp/ansible-tmp-1726853253.6545012-11980-235046469936267/AnsiballZ_stat.py && sleep 0' 11044 1726853253.76873: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11044 1726853253.77041: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration 
requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 11044 1726853253.77158: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11044 1726853253.77198: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11044 1726853253.77289: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11044 1726853253.79054: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11044 1726853253.79063: stdout chunk (state=3): >>><<< 11044 1726853253.79079: stderr chunk (state=3): >>><<< 11044 1726853253.79273: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11044 1726853253.79277: _low_level_execute_command(): starting 11044 1726853253.79280: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853253.6545012-11980-235046469936267/AnsiballZ_stat.py && sleep 0' 11044 1726853253.80311: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11044 1726853253.80389: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11044 1726853253.80547: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK <<< 11044 1726853253.80696: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11044 1726853253.80861: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 
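The `AnsiballZ_stat.py` invocation above returns its result as a JSON document on stdout, which the controller parses out of the stream in the next record. A minimal sketch of that result shape, assuming plain Python rather than the real `ansible.modules.stat` implementation:

```python
import json
import os


def stat_result(path):
    """Build a stat-module-style result dict; "exists" mirrors os.path.exists."""
    return {"changed": False, "stat": {"exists": os.path.exists(path)}}


# On the managed node in this run the profile file was absent, so the module
# replied with {"changed": false, "stat": {"exists": false}} on stdout.
print(json.dumps(stat_result("/etc/sysconfig/network-scripts/ifcfg-bond0")))
```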
11044 1726853253.96517: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-bond0", "follow": false, "checksum_algorithm": "sha1"}}} <<< 11044 1726853253.97831: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.45.153 closed. <<< 11044 1726853253.97855: stderr chunk (state=3): >>><<< 11044 1726853253.97858: stdout chunk (state=3): >>><<< 11044 1726853253.97876: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-bond0", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master 
session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.45.153 closed. 11044 1726853253.97899: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/etc/sysconfig/network-scripts/ifcfg-bond0', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853253.6545012-11980-235046469936267/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11044 1726853253.97907: _low_level_execute_command(): starting 11044 1726853253.97912: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853253.6545012-11980-235046469936267/ > /dev/null 2>&1 && sleep 0' 11044 1726853253.98331: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11044 1726853253.98363: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 <<< 11044 1726853253.98368: stderr chunk (state=3): >>>debug2: match not found <<< 11044 1726853253.98370: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11044 1726853253.98383: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: 
re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 11044 1726853253.98386: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11044 1726853253.98388: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11044 1726853253.98429: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 11044 1726853253.98432: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11044 1726853253.98436: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11044 1726853253.98477: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11044 1726853254.00578: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11044 1726853254.00581: stdout chunk (state=3): >>><<< 11044 1726853254.00583: stderr chunk (state=3): >>><<< 11044 1726853254.00590: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11044 1726853254.00592: handler run complete 11044 1726853254.00593: attempt loop complete, returning result 11044 1726853254.00595: _execute() done 11044 1726853254.00596: dumping result to json 11044 1726853254.00598: done dumping result, returning 11044 1726853254.00599: done running TaskExecutor() for managed_node1/TASK: Stat profile file [02083763-bbaf-c5a6-f857-0000000003b4] 11044 1726853254.00601: sending task result for task 02083763-bbaf-c5a6-f857-0000000003b4 11044 1726853254.00669: done sending task result for task 02083763-bbaf-c5a6-f857-0000000003b4 11044 1726853254.00673: WORKER PROCESS EXITING ok: [managed_node1] => { "changed": false, "stat": { "exists": false } } 11044 1726853254.00728: no more pending results, returning what we have 11044 1726853254.00731: results queue empty 11044 1726853254.00731: checking for any_errors_fatal 11044 1726853254.00736: done checking for any_errors_fatal 11044 1726853254.00736: checking for max_fail_percentage 11044 1726853254.00738: done checking for max_fail_percentage 11044 1726853254.00739: checking to see if all hosts have failed and the running result is not ok 11044 1726853254.00740: done checking to see if all hosts have failed 11044 1726853254.00740: getting the remaining hosts for this loop 11044 1726853254.00741: done getting the remaining hosts for this loop 11044 1726853254.00747: getting the next task for host managed_node1 11044 1726853254.00752: done getting next task for host managed_node1 11044 1726853254.00754: ^ task is: TASK: Set NM profile exist flag based on the profile files 11044 1726853254.00758: ^ 
state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11044 1726853254.00762: getting variables 11044 1726853254.00763: in VariableManager get_vars() 11044 1726853254.00805: Calling all_inventory to load vars for managed_node1 11044 1726853254.00808: Calling groups_inventory to load vars for managed_node1 11044 1726853254.00810: Calling all_plugins_inventory to load vars for managed_node1 11044 1726853254.00819: Calling all_plugins_play to load vars for managed_node1 11044 1726853254.00821: Calling groups_plugins_inventory to load vars for managed_node1 11044 1726853254.00823: Calling groups_plugins_play to load vars for managed_node1 11044 1726853254.02260: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11044 1726853254.03979: done with get_vars() 11044 1726853254.04007: done getting variables 11044 1726853254.04078: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, 
class_only=True) TASK [Set NM profile exist flag based on the profile files] ******************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:17 Friday 20 September 2024 13:27:34 -0400 (0:00:00.445) 0:00:18.416 ****** 11044 1726853254.04111: entering _queue_task() for managed_node1/set_fact 11044 1726853254.04698: worker is 1 (out of 1 available) 11044 1726853254.04710: exiting _queue_task() for managed_node1/set_fact 11044 1726853254.04721: done queuing things up, now waiting for results queue to drain 11044 1726853254.04723: waiting for pending results... 11044 1726853254.04892: running TaskExecutor() for managed_node1/TASK: Set NM profile exist flag based on the profile files 11044 1726853254.05063: in run() - task 02083763-bbaf-c5a6-f857-0000000003b5 11044 1726853254.05068: variable 'ansible_search_path' from source: unknown 11044 1726853254.05070: variable 'ansible_search_path' from source: unknown 11044 1726853254.05075: calling self._execute() 11044 1726853254.05169: variable 'ansible_host' from source: host vars for 'managed_node1' 11044 1726853254.05186: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11044 1726853254.05207: variable 'omit' from source: magic vars 11044 1726853254.05643: variable 'ansible_distribution_major_version' from source: facts 11044 1726853254.05666: Evaluated conditional (ansible_distribution_major_version != '6'): True 11044 1726853254.05807: variable 'profile_stat' from source: set_fact 11044 1726853254.05836: Evaluated conditional (profile_stat.stat.exists): False 11044 1726853254.05854: when evaluation is False, skipping this task 11044 1726853254.05857: _execute() done 11044 1726853254.05935: dumping result to json 11044 1726853254.05939: done dumping result, returning 11044 1726853254.05941: done running TaskExecutor() for managed_node1/TASK: Set NM profile exist flag based on the profile files 
[02083763-bbaf-c5a6-f857-0000000003b5] 11044 1726853254.05946: sending task result for task 02083763-bbaf-c5a6-f857-0000000003b5 11044 1726853254.06023: done sending task result for task 02083763-bbaf-c5a6-f857-0000000003b5 11044 1726853254.06026: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 11044 1726853254.06166: no more pending results, returning what we have 11044 1726853254.06169: results queue empty 11044 1726853254.06172: checking for any_errors_fatal 11044 1726853254.06190: done checking for any_errors_fatal 11044 1726853254.06191: checking for max_fail_percentage 11044 1726853254.06193: done checking for max_fail_percentage 11044 1726853254.06194: checking to see if all hosts have failed and the running result is not ok 11044 1726853254.06195: done checking to see if all hosts have failed 11044 1726853254.06195: getting the remaining hosts for this loop 11044 1726853254.06197: done getting the remaining hosts for this loop 11044 1726853254.06200: getting the next task for host managed_node1 11044 1726853254.06207: done getting next task for host managed_node1 11044 1726853254.06210: ^ task is: TASK: Get NM profile info 11044 1726853254.06214: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11044 1726853254.06219: getting variables 11044 1726853254.06222: in VariableManager get_vars() 11044 1726853254.06386: Calling all_inventory to load vars for managed_node1 11044 1726853254.06389: Calling groups_inventory to load vars for managed_node1 11044 1726853254.06392: Calling all_plugins_inventory to load vars for managed_node1 11044 1726853254.06403: Calling all_plugins_play to load vars for managed_node1 11044 1726853254.06406: Calling groups_plugins_inventory to load vars for managed_node1 11044 1726853254.06408: Calling groups_plugins_play to load vars for managed_node1 11044 1726853254.08119: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11044 1726853254.09806: done with get_vars() 11044 1726853254.09842: done getting variables 11044 1726853254.09907: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Get NM profile info] ***************************************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:25 Friday 20 September 2024 13:27:34 -0400 (0:00:00.058) 0:00:18.475 ****** 11044 1726853254.09948: entering _queue_task() for managed_node1/shell 11044 1726853254.10332: worker is 1 (out of 1 available) 11044 1726853254.10348: exiting _queue_task() for managed_node1/shell 11044 1726853254.10360: done queuing things up, now waiting for results queue to drain 11044 1726853254.10361: waiting for pending results... 
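The skip recorded earlier ("Evaluated conditional (profile_stat.stat.exists): False") follows the usual pattern of gating a task on a registered stat result. A rough sketch of that gate, with hypothetical names standing in for TaskExecutor's internal when-handling:

```python
# Hypothetical mirror of the skip seen in the log: the set_fact task runs only
# when the registered stat result reports the profile file as existing.
profile_stat = {"stat": {"exists": False}}  # registered by the earlier stat task


def evaluate_when(condition_result):
    """Return (run, skip_reason), loosely like the conditional check in the log."""
    if not condition_result:
        return False, "Conditional result was False"
    return True, ""


run_task, reason = evaluate_when(profile_stat["stat"]["exists"])
# run_task is False here, matching the "skipping: [managed_node1]" record above.
```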
11044 1726853254.10797: running TaskExecutor() for managed_node1/TASK: Get NM profile info 11044 1726853254.10810: in run() - task 02083763-bbaf-c5a6-f857-0000000003b6 11044 1726853254.10834: variable 'ansible_search_path' from source: unknown 11044 1726853254.10843: variable 'ansible_search_path' from source: unknown 11044 1726853254.10892: calling self._execute() 11044 1726853254.11026: variable 'ansible_host' from source: host vars for 'managed_node1' 11044 1726853254.11126: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11044 1726853254.11131: variable 'omit' from source: magic vars 11044 1726853254.11480: variable 'ansible_distribution_major_version' from source: facts 11044 1726853254.11499: Evaluated conditional (ansible_distribution_major_version != '6'): True 11044 1726853254.11510: variable 'omit' from source: magic vars 11044 1726853254.11575: variable 'omit' from source: magic vars 11044 1726853254.11690: variable 'profile' from source: include params 11044 1726853254.11699: variable 'item' from source: include params 11044 1726853254.11782: variable 'item' from source: include params 11044 1726853254.11799: variable 'omit' from source: magic vars 11044 1726853254.11876: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11044 1726853254.11901: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11044 1726853254.11925: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11044 1726853254.11960: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11044 1726853254.11980: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11044 1726853254.12089: variable 'inventory_hostname' from source: host vars for 'managed_node1' 11044 
1726853254.12091: variable 'ansible_host' from source: host vars for 'managed_node1' 11044 1726853254.12094: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11044 1726853254.12181: Set connection var ansible_timeout to 10 11044 1726853254.12195: Set connection var ansible_shell_executable to /bin/sh 11044 1726853254.12203: Set connection var ansible_shell_type to sh 11044 1726853254.12225: Set connection var ansible_module_compression to ZIP_DEFLATED 11044 1726853254.12243: Set connection var ansible_connection to ssh 11044 1726853254.12249: Set connection var ansible_pipelining to False 11044 1726853254.12275: variable 'ansible_shell_executable' from source: unknown 11044 1726853254.12278: variable 'ansible_connection' from source: unknown 11044 1726853254.12281: variable 'ansible_module_compression' from source: unknown 11044 1726853254.12284: variable 'ansible_shell_type' from source: unknown 11044 1726853254.12286: variable 'ansible_shell_executable' from source: unknown 11044 1726853254.12289: variable 'ansible_host' from source: host vars for 'managed_node1' 11044 1726853254.12291: variable 'ansible_pipelining' from source: unknown 11044 1726853254.12294: variable 'ansible_timeout' from source: unknown 11044 1726853254.12296: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11044 1726853254.12664: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11044 1726853254.12668: variable 'omit' from source: magic vars 11044 1726853254.12672: starting attempt loop 11044 1726853254.12675: running the handler 11044 1726853254.12677: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11044 1726853254.12680: _low_level_execute_command(): starting 11044 1726853254.12682: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11044 1726853254.13239: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11044 1726853254.13275: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found <<< 11044 1726853254.13278: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 11044 1726853254.13282: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11044 1726853254.13349: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 11044 1726853254.13353: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11044 1726853254.13355: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11044 1726853254.13398: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11044 1726853254.15087: stdout chunk (state=3): >>>/root <<< 11044 1726853254.15235: stderr 
chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11044 1726853254.15241: stdout chunk (state=3): >>><<< 11044 1726853254.15250: stderr chunk (state=3): >>><<< 11044 1726853254.15277: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11044 1726853254.15298: _low_level_execute_command(): starting 11044 1726853254.15306: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853254.1527665-12010-202972839819319 `" && echo ansible-tmp-1726853254.1527665-12010-202972839819319="` echo /root/.ansible/tmp/ansible-tmp-1726853254.1527665-12010-202972839819319 `" ) && sleep 0' 11044 1726853254.15902: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11044 1726853254.15917: 
stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11044 1726853254.15931: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11044 1726853254.15953: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11044 1726853254.15984: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 <<< 11044 1726853254.16068: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK <<< 11044 1726853254.16096: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11044 1726853254.16174: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11044 1726853254.18072: stdout chunk (state=3): >>>ansible-tmp-1726853254.1527665-12010-202972839819319=/root/.ansible/tmp/ansible-tmp-1726853254.1527665-12010-202972839819319 <<< 11044 1726853254.18175: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11044 1726853254.18203: stderr chunk (state=3): >>><<< 11044 1726853254.18206: stdout chunk (state=3): >>><<< 11044 1726853254.18223: _low_level_execute_command() done: rc=0, 
stdout=ansible-tmp-1726853254.1527665-12010-202972839819319=/root/.ansible/tmp/ansible-tmp-1726853254.1527665-12010-202972839819319 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11044 1726853254.18255: variable 'ansible_module_compression' from source: unknown 11044 1726853254.18299: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-1104467doc9gy/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 11044 1726853254.18334: variable 'ansible_facts' from source: unknown 11044 1726853254.18387: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853254.1527665-12010-202972839819319/AnsiballZ_command.py 11044 1726853254.18493: Sending initial data 11044 1726853254.18496: Sent initial data (156 bytes) 11044 1726853254.19161: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK <<< 11044 1726853254.19212: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11044 1726853254.19256: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11044 1726853254.20799: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 11044 
1726853254.20836: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 11044 1726853254.20878: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-1104467doc9gy/tmp8c1x0tx9 /root/.ansible/tmp/ansible-tmp-1726853254.1527665-12010-202972839819319/AnsiballZ_command.py <<< 11044 1726853254.20885: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853254.1527665-12010-202972839819319/AnsiballZ_command.py" <<< 11044 1726853254.20918: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-1104467doc9gy/tmp8c1x0tx9" to remote "/root/.ansible/tmp/ansible-tmp-1726853254.1527665-12010-202972839819319/AnsiballZ_command.py" <<< 11044 1726853254.20922: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853254.1527665-12010-202972839819319/AnsiballZ_command.py" <<< 11044 1726853254.21438: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11044 1726853254.21485: stderr chunk (state=3): >>><<< 11044 1726853254.21488: stdout chunk (state=3): >>><<< 11044 1726853254.21533: done transferring module to remote 11044 1726853254.21542: _low_level_execute_command(): starting 11044 1726853254.21548: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853254.1527665-12010-202972839819319/ /root/.ansible/tmp/ansible-tmp-1726853254.1527665-12010-202972839819319/AnsiballZ_command.py && sleep 0' 11044 1726853254.21969: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11044 1726853254.22004: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11044 1726853254.22010: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.153 
originally 10.31.45.153 debug2: match not found <<< 11044 1726853254.22012: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11044 1726853254.22018: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 11044 1726853254.22020: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found <<< 11044 1726853254.22022: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11044 1726853254.22064: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK <<< 11044 1726853254.22068: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11044 1726853254.22110: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11044 1726853254.23829: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11044 1726853254.23858: stderr chunk (state=3): >>><<< 11044 1726853254.23861: stdout chunk (state=3): >>><<< 11044 1726853254.23881: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11044 1726853254.23884: _low_level_execute_command(): starting 11044 1726853254.23889: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853254.1527665-12010-202972839819319/AnsiballZ_command.py && sleep 0' 11044 1726853254.24333: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11044 1726853254.24336: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found <<< 11044 1726853254.24339: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11044 1726853254.24341: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11044 1726853254.24343: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match 
for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11044 1726853254.24398: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 11044 1726853254.24407: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11044 1726853254.24410: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11044 1726853254.24445: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11044 1726853254.41982: stdout chunk (state=3): >>> {"changed": true, "stdout": "bond0.0 /etc/NetworkManager/system-connections/bond0.0.nmconnection \nbond0.1 /etc/NetworkManager/system-connections/bond0.1.nmconnection \nbond0 /etc/NetworkManager/system-connections/bond0.nmconnection ", "stderr": "", "rc": 0, "cmd": "nmcli -f NAME,FILENAME connection show |grep bond0 | grep /etc", "start": "2024-09-20 13:27:34.392760", "end": "2024-09-20 13:27:34.417123", "delta": "0:00:00.024363", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep bond0 | grep /etc", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 11044 1726853254.43342: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11044 1726853254.43360: stderr chunk (state=3): >>>Shared connection to 10.31.45.153 closed. 
<<< 11044 1726853254.43424: stderr chunk (state=3): >>><<< 11044 1726853254.43442: stdout chunk (state=3): >>><<< 11044 1726853254.43474: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "bond0.0 /etc/NetworkManager/system-connections/bond0.0.nmconnection \nbond0.1 /etc/NetworkManager/system-connections/bond0.1.nmconnection \nbond0 /etc/NetworkManager/system-connections/bond0.nmconnection ", "stderr": "", "rc": 0, "cmd": "nmcli -f NAME,FILENAME connection show |grep bond0 | grep /etc", "start": "2024-09-20 13:27:34.392760", "end": "2024-09-20 13:27:34.417123", "delta": "0:00:00.024363", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep bond0 | grep /etc", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: 
mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.45.153 closed. 11044 1726853254.43523: done with _execute_module (ansible.legacy.command, {'_raw_params': 'nmcli -f NAME,FILENAME connection show |grep bond0 | grep /etc', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853254.1527665-12010-202972839819319/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11044 1726853254.43547: _low_level_execute_command(): starting 11044 1726853254.43557: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853254.1527665-12010-202972839819319/ > /dev/null 2>&1 && sleep 0' 11044 1726853254.44235: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11044 1726853254.44252: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11044 1726853254.44265: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11044 1726853254.44285: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11044 1726853254.44312: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11044 
1726853254.44403: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK <<< 11044 1726853254.44428: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11044 1726853254.44524: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11044 1726853254.46332: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11044 1726853254.46356: stdout chunk (state=3): >>><<< 11044 1726853254.46359: stderr chunk (state=3): >>><<< 11044 1726853254.46379: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match 
found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11044 1726853254.46577: handler run complete 11044 1726853254.46581: Evaluated conditional (False): False 11044 1726853254.46584: attempt loop complete, returning result 11044 1726853254.46587: _execute() done 11044 1726853254.46590: dumping result to json 11044 1726853254.46592: done dumping result, returning 11044 1726853254.46595: done running TaskExecutor() for managed_node1/TASK: Get NM profile info [02083763-bbaf-c5a6-f857-0000000003b6] 11044 1726853254.46598: sending task result for task 02083763-bbaf-c5a6-f857-0000000003b6 11044 1726853254.46672: done sending task result for task 02083763-bbaf-c5a6-f857-0000000003b6 ok: [managed_node1] => { "changed": false, "cmd": "nmcli -f NAME,FILENAME connection show |grep bond0 | grep /etc", "delta": "0:00:00.024363", "end": "2024-09-20 13:27:34.417123", "rc": 0, "start": "2024-09-20 13:27:34.392760" } STDOUT: bond0.0 /etc/NetworkManager/system-connections/bond0.0.nmconnection bond0.1 /etc/NetworkManager/system-connections/bond0.1.nmconnection bond0 /etc/NetworkManager/system-connections/bond0.nmconnection 11044 1726853254.46742: no more pending results, returning what we have 11044 1726853254.46748: results queue empty 11044 1726853254.46749: checking for any_errors_fatal 11044 1726853254.46756: done checking for any_errors_fatal 11044 1726853254.46757: checking for max_fail_percentage 11044 1726853254.46758: done checking for max_fail_percentage 11044 1726853254.46759: checking to see if all hosts have failed and the running result is not ok 11044 1726853254.46760: done checking to see if all hosts have failed 11044 1726853254.46760: getting the remaining hosts for this loop 
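The STDOUT block above is the two-column `nmcli -f NAME,FILENAME connection show` listing that the task pipes through `grep`. As a rough illustrative sketch (not part of the playbook or of Ansible itself), that output can be split into name-to-file pairs; the sample data is copied verbatim from the log, and splitting on the last whitespace run is an assumption that holds here because these paths contain no spaces:

```python
# Hedged sketch: turn the two-column `nmcli -f NAME,FILENAME connection show`
# output captured above into a {name: filename} mapping.
# Assumption: the FILENAME column is the last whitespace-separated field
# (true for this sample, where the paths contain no spaces).
stdout = (
    "bond0.0     /etc/NetworkManager/system-connections/bond0.0.nmconnection \n"
    "bond0.1     /etc/NetworkManager/system-connections/bond0.1.nmconnection \n"
    "bond0       /etc/NetworkManager/system-connections/bond0.nmconnection "
)

profiles = {}
for line in stdout.splitlines():
    line = line.strip()
    if not line:
        continue
    # rsplit(None, 1) splits once, on the last run of whitespace
    name, filename = line.rsplit(None, 1)
    profiles[name] = filename

print(profiles["bond0"])
# -> /etc/NetworkManager/system-connections/bond0.nmconnection
```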
11044 1726853254.46762: done getting the remaining hosts for this loop 11044 1726853254.46764: getting the next task for host managed_node1 11044 1726853254.46770: done getting next task for host managed_node1 11044 1726853254.46775: ^ task is: TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output 11044 1726853254.46779: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11044 1726853254.46782: getting variables 11044 1726853254.46784: in VariableManager get_vars() 11044 1726853254.46819: Calling all_inventory to load vars for managed_node1 11044 1726853254.46822: Calling groups_inventory to load vars for managed_node1 11044 1726853254.46824: Calling all_plugins_inventory to load vars for managed_node1 11044 1726853254.46835: Calling all_plugins_play to load vars for managed_node1 11044 1726853254.46837: Calling groups_plugins_inventory to load vars for managed_node1 11044 1726853254.46841: Calling groups_plugins_play to load vars for managed_node1 11044 1726853254.47361: WORKER PROCESS EXITING 11044 1726853254.48162: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11044 1726853254.49801: done with get_vars() 11044 1726853254.49823: done getting variables 11044 1726853254.49883: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set NM profile exist flag and ansible_managed flag true based on the nmcli output] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:35 Friday 20 September 2024 13:27:34 -0400 (0:00:00.399) 0:00:18.874 ****** 11044 1726853254.49913: entering _queue_task() for managed_node1/set_fact 11044 1726853254.50254: worker is 1 (out of 1 available) 11044 1726853254.50269: exiting _queue_task() for managed_node1/set_fact 11044 1726853254.50384: done queuing things up, now waiting for results queue to drain 11044 1726853254.50386: waiting for pending results... 
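The `set_fact` task queued here only fires when the registered probe succeeded, as the log shows with `Evaluated conditional (nm_profile_exists.rc == 0): True`. A minimal stand-alone Python sketch of that flag logic follows; the variable name, rc value, and fact names are taken from the log, but the function of `set_fact` is only mimicked here, this is illustrative code, not Ansible internals:

```python
# Hedged sketch of the conditional + flags this set_fact task applies.
# `nm_profile_exists` mirrors only the fields of the registered
# "Get NM profile info" result that the conditional actually uses.
nm_profile_exists = {"rc": 0}

facts = {}
if nm_profile_exists["rc"] == 0:  # the conditional evaluated True in the log
    facts.update(
        lsr_net_profile_exists=True,
        lsr_net_profile_ansible_managed=True,
        lsr_net_profile_fingerprint=True,
    )

print(facts)
```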
11044 1726853254.50566: running TaskExecutor() for managed_node1/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output 11044 1726853254.50687: in run() - task 02083763-bbaf-c5a6-f857-0000000003b7 11044 1726853254.50706: variable 'ansible_search_path' from source: unknown 11044 1726853254.50714: variable 'ansible_search_path' from source: unknown 11044 1726853254.50755: calling self._execute() 11044 1726853254.50866: variable 'ansible_host' from source: host vars for 'managed_node1' 11044 1726853254.50884: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11044 1726853254.50898: variable 'omit' from source: magic vars 11044 1726853254.51298: variable 'ansible_distribution_major_version' from source: facts 11044 1726853254.51476: Evaluated conditional (ansible_distribution_major_version != '6'): True 11044 1726853254.51480: variable 'nm_profile_exists' from source: set_fact 11044 1726853254.51483: Evaluated conditional (nm_profile_exists.rc == 0): True 11044 1726853254.51485: variable 'omit' from source: magic vars 11044 1726853254.51536: variable 'omit' from source: magic vars 11044 1726853254.51573: variable 'omit' from source: magic vars 11044 1726853254.51624: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11044 1726853254.51666: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11044 1726853254.51694: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11044 1726853254.51722: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11044 1726853254.51741: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11044 1726853254.51777: variable 'inventory_hostname' from source: host vars for 'managed_node1' 
11044 1726853254.51787: variable 'ansible_host' from source: host vars for 'managed_node1' 11044 1726853254.51795: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11044 1726853254.51905: Set connection var ansible_timeout to 10 11044 1726853254.51923: Set connection var ansible_shell_executable to /bin/sh 11044 1726853254.51934: Set connection var ansible_shell_type to sh 11044 1726853254.51943: Set connection var ansible_module_compression to ZIP_DEFLATED 11044 1726853254.51952: Set connection var ansible_connection to ssh 11044 1726853254.51963: Set connection var ansible_pipelining to False 11044 1726853254.51992: variable 'ansible_shell_executable' from source: unknown 11044 1726853254.52001: variable 'ansible_connection' from source: unknown 11044 1726853254.52009: variable 'ansible_module_compression' from source: unknown 11044 1726853254.52016: variable 'ansible_shell_type' from source: unknown 11044 1726853254.52037: variable 'ansible_shell_executable' from source: unknown 11044 1726853254.52040: variable 'ansible_host' from source: host vars for 'managed_node1' 11044 1726853254.52043: variable 'ansible_pipelining' from source: unknown 11044 1726853254.52045: variable 'ansible_timeout' from source: unknown 11044 1726853254.52077: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11044 1726853254.52205: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11044 1726853254.52223: variable 'omit' from source: magic vars 11044 1726853254.52233: starting attempt loop 11044 1726853254.52255: running the handler 11044 1726853254.52260: handler run complete 11044 1726853254.52277: attempt loop complete, returning result 11044 1726853254.52364: _execute() done 
11044 1726853254.52368: dumping result to json
11044 1726853254.52370: done dumping result, returning
11044 1726853254.52375: done running TaskExecutor() for managed_node1/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output [02083763-bbaf-c5a6-f857-0000000003b7]
11044 1726853254.52377: sending task result for task 02083763-bbaf-c5a6-f857-0000000003b7
11044 1726853254.52446: done sending task result for task 02083763-bbaf-c5a6-f857-0000000003b7
11044 1726853254.52450: WORKER PROCESS EXITING
ok: [managed_node1] => {
    "ansible_facts": {
        "lsr_net_profile_ansible_managed": true,
        "lsr_net_profile_exists": true,
        "lsr_net_profile_fingerprint": true
    },
    "changed": false
}
11044 1726853254.52526: no more pending results, returning what we have
11044 1726853254.52529: results queue empty
11044 1726853254.52530: checking for any_errors_fatal
11044 1726853254.52540: done checking for any_errors_fatal
11044 1726853254.52541: checking for max_fail_percentage
11044 1726853254.52542: done checking for max_fail_percentage
11044 1726853254.52544: checking to see if all hosts have failed and the running result is not ok
11044 1726853254.52545: done checking to see if all hosts have failed
11044 1726853254.52545: getting the remaining hosts for this loop
11044 1726853254.52547: done getting the remaining hosts for this loop
11044 1726853254.52550: getting the next task for host managed_node1
11044 1726853254.52559: done getting next task for host managed_node1
11044 1726853254.52562: ^ task is: TASK: Get the ansible_managed comment in ifcfg-{{ profile }}
11044 1726853254.52567: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
11044 1726853254.52573: getting variables
11044 1726853254.52574: in VariableManager get_vars()
11044 1726853254.52618: Calling all_inventory to load vars for managed_node1
11044 1726853254.52623: Calling groups_inventory to load vars for managed_node1
11044 1726853254.52626: Calling all_plugins_inventory to load vars for managed_node1
11044 1726853254.52638: Calling all_plugins_play to load vars for managed_node1
11044 1726853254.52641: Calling groups_plugins_inventory to load vars for managed_node1
11044 1726853254.52645: Calling groups_plugins_play to load vars for managed_node1
11044 1726853254.54160: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
11044 1726853254.55627: done with get_vars()
11044 1726853254.55651: done getting variables
11044 1726853254.55713: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)
11044 1726853254.55898: variable 'profile' from source: include params
11044 1726853254.55903: variable 'item' from source: include params
11044 1726853254.55969: variable 'item' from source: include params

TASK [Get the ansible_managed comment in ifcfg-bond0] **************************
task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:49
Friday 20 September 2024 13:27:34 -0400 (0:00:00.060) 0:00:18.935 ******
11044 1726853254.56011: entering _queue_task() for managed_node1/command
11044 1726853254.56417: worker is 1 (out of 1 available)
11044 1726853254.56431: exiting _queue_task() for managed_node1/command
11044 1726853254.56445: done queuing things up, now waiting for results queue to drain
11044 1726853254.56446: waiting for pending results...
11044 1726853254.56659: running TaskExecutor() for managed_node1/TASK: Get the ansible_managed comment in ifcfg-bond0
11044 1726853254.56737: in run() - task 02083763-bbaf-c5a6-f857-0000000003b9
11044 1726853254.56751: variable 'ansible_search_path' from source: unknown
11044 1726853254.56756: variable 'ansible_search_path' from source: unknown
11044 1726853254.56789: calling self._execute()
11044 1726853254.56865: variable 'ansible_host' from source: host vars for 'managed_node1'
11044 1726853254.56870: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
11044 1726853254.56881: variable 'omit' from source: magic vars
11044 1726853254.57155: variable 'ansible_distribution_major_version' from source: facts
11044 1726853254.57165: Evaluated conditional (ansible_distribution_major_version != '6'): True
11044 1726853254.57247: variable 'profile_stat' from source: set_fact
11044 1726853254.57261: Evaluated conditional (profile_stat.stat.exists): False
11044 1726853254.57264: when evaluation is False, skipping this task
11044 1726853254.57268: _execute() done
11044 1726853254.57272: dumping result to json
11044 1726853254.57275: done dumping result, returning
11044 1726853254.57282: done running TaskExecutor() for managed_node1/TASK: Get the ansible_managed comment in ifcfg-bond0 [02083763-bbaf-c5a6-f857-0000000003b9]
11044 1726853254.57285: sending task result for task 02083763-bbaf-c5a6-f857-0000000003b9
11044 1726853254.57369: done sending task result for task 02083763-bbaf-c5a6-f857-0000000003b9
11044 1726853254.57374: WORKER PROCESS EXITING
skipping: [managed_node1] => {
    "changed": false,
    "false_condition": "profile_stat.stat.exists",
    "skip_reason": "Conditional result was False"
}
11044 1726853254.57425: no more pending results, returning what we have
11044 1726853254.57429: results queue empty
11044 1726853254.57430: checking for any_errors_fatal
11044 1726853254.57438: done checking for any_errors_fatal
11044 1726853254.57438: checking for max_fail_percentage
11044 1726853254.57440: done checking for max_fail_percentage
11044 1726853254.57441: checking to see if all hosts have failed and the running result is not ok
11044 1726853254.57441: done checking to see if all hosts have failed
11044 1726853254.57442: getting the remaining hosts for this loop
11044 1726853254.57443: done getting the remaining hosts for this loop
11044 1726853254.57446: getting the next task for host managed_node1
11044 1726853254.57453: done getting next task for host managed_node1
11044 1726853254.57456: ^ task is: TASK: Verify the ansible_managed comment in ifcfg-{{ profile }}
11044 1726853254.57460: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
11044 1726853254.57465: getting variables
11044 1726853254.57466: in VariableManager get_vars()
11044 1726853254.57509: Calling all_inventory to load vars for managed_node1
11044 1726853254.57511: Calling groups_inventory to load vars for managed_node1
11044 1726853254.57513: Calling all_plugins_inventory to load vars for managed_node1
11044 1726853254.57524: Calling all_plugins_play to load vars for managed_node1
11044 1726853254.57526: Calling groups_plugins_inventory to load vars for managed_node1
11044 1726853254.57529: Calling groups_plugins_play to load vars for managed_node1
11044 1726853254.58540: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
11044 1726853254.59736: done with get_vars()
11044 1726853254.59755: done getting variables
11044 1726853254.59801: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)
11044 1726853254.59883: variable 'profile' from source: include params
11044 1726853254.59886: variable 'item' from source: include params
11044 1726853254.59927: variable 'item' from source: include params

TASK [Verify the ansible_managed comment in ifcfg-bond0] ***********************
task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:56
Friday 20 September 2024 13:27:34 -0400 (0:00:00.039) 0:00:18.975 ******
11044 1726853254.59951: entering _queue_task() for managed_node1/set_fact
11044 1726853254.60203: worker is 1 (out of 1 available)
11044 1726853254.60215: exiting _queue_task() for managed_node1/set_fact
11044 1726853254.60230: done queuing things up, now waiting for results queue to drain
11044 1726853254.60232: waiting for pending results...
11044 1726853254.60407: running TaskExecutor() for managed_node1/TASK: Verify the ansible_managed comment in ifcfg-bond0
11044 1726853254.60480: in run() - task 02083763-bbaf-c5a6-f857-0000000003ba
11044 1726853254.60492: variable 'ansible_search_path' from source: unknown
11044 1726853254.60496: variable 'ansible_search_path' from source: unknown
11044 1726853254.60524: calling self._execute()
11044 1726853254.60604: variable 'ansible_host' from source: host vars for 'managed_node1'
11044 1726853254.60608: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
11044 1726853254.60617: variable 'omit' from source: magic vars
11044 1726853254.60884: variable 'ansible_distribution_major_version' from source: facts
11044 1726853254.60897: Evaluated conditional (ansible_distribution_major_version != '6'): True
11044 1726853254.60977: variable 'profile_stat' from source: set_fact
11044 1726853254.60990: Evaluated conditional (profile_stat.stat.exists): False
11044 1726853254.60993: when evaluation is False, skipping this task
11044 1726853254.60995: _execute() done
11044 1726853254.60999: dumping result to json
11044 1726853254.61002: done dumping result, returning
11044 1726853254.61005: done running TaskExecutor() for managed_node1/TASK: Verify the ansible_managed comment in ifcfg-bond0 [02083763-bbaf-c5a6-f857-0000000003ba]
11044 1726853254.61009: sending task result for task 02083763-bbaf-c5a6-f857-0000000003ba
11044 1726853254.61093: done sending task result for task 02083763-bbaf-c5a6-f857-0000000003ba
11044 1726853254.61095: WORKER PROCESS EXITING
skipping: [managed_node1] => {
    "changed": false,
    "false_condition": "profile_stat.stat.exists",
    "skip_reason": "Conditional result was False"
}
11044 1726853254.61164: no more pending results, returning what we have
11044 1726853254.61168: results queue empty
11044 1726853254.61169: checking for any_errors_fatal
11044 1726853254.61178: done checking for any_errors_fatal
11044 1726853254.61179: checking for max_fail_percentage
11044 1726853254.61180: done checking for max_fail_percentage
11044 1726853254.61181: checking to see if all hosts have failed and the running result is not ok
11044 1726853254.61182: done checking to see if all hosts have failed
11044 1726853254.61182: getting the remaining hosts for this loop
11044 1726853254.61184: done getting the remaining hosts for this loop
11044 1726853254.61187: getting the next task for host managed_node1
11044 1726853254.61193: done getting next task for host managed_node1
11044 1726853254.61195: ^ task is: TASK: Get the fingerprint comment in ifcfg-{{ profile }}
11044 1726853254.61199: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
11044 1726853254.61202: getting variables
11044 1726853254.61204: in VariableManager get_vars()
11044 1726853254.61240: Calling all_inventory to load vars for managed_node1
11044 1726853254.61242: Calling groups_inventory to load vars for managed_node1
11044 1726853254.61244: Calling all_plugins_inventory to load vars for managed_node1
11044 1726853254.61253: Calling all_plugins_play to load vars for managed_node1
11044 1726853254.61256: Calling groups_plugins_inventory to load vars for managed_node1
11044 1726853254.61258: Calling groups_plugins_play to load vars for managed_node1
11044 1726853254.62416: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
11044 1726853254.63950: done with get_vars()
11044 1726853254.63984: done getting variables
11044 1726853254.64045: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)
11044 1726853254.64156: variable 'profile' from source: include params
11044 1726853254.64160: variable 'item' from source: include params
11044 1726853254.64218: variable 'item' from source: include params

TASK [Get the fingerprint comment in ifcfg-bond0] ******************************
task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:62
Friday 20 September 2024 13:27:34 -0400 (0:00:00.042) 0:00:19.018 ******
11044 1726853254.64252: entering _queue_task() for managed_node1/command
11044 1726853254.64609: worker is 1 (out of 1 available)
11044 1726853254.64624: exiting _queue_task() for managed_node1/command
11044 1726853254.64638: done queuing things up, now waiting for results queue to drain
11044 1726853254.64639: waiting for pending results...
11044 1726853254.65089: running TaskExecutor() for managed_node1/TASK: Get the fingerprint comment in ifcfg-bond0
11044 1726853254.65095: in run() - task 02083763-bbaf-c5a6-f857-0000000003bb
11044 1726853254.65099: variable 'ansible_search_path' from source: unknown
11044 1726853254.65101: variable 'ansible_search_path' from source: unknown
11044 1726853254.65104: calling self._execute()
11044 1726853254.65202: variable 'ansible_host' from source: host vars for 'managed_node1'
11044 1726853254.65217: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
11044 1726853254.65234: variable 'omit' from source: magic vars
11044 1726853254.65610: variable 'ansible_distribution_major_version' from source: facts
11044 1726853254.65629: Evaluated conditional (ansible_distribution_major_version != '6'): True
11044 1726853254.65758: variable 'profile_stat' from source: set_fact
11044 1726853254.65782: Evaluated conditional (profile_stat.stat.exists): False
11044 1726853254.65789: when evaluation is False, skipping this task
11044 1726853254.65796: _execute() done
11044 1726853254.65804: dumping result to json
11044 1726853254.65811: done dumping result, returning
11044 1726853254.65822: done running TaskExecutor() for managed_node1/TASK: Get the fingerprint comment in ifcfg-bond0 [02083763-bbaf-c5a6-f857-0000000003bb]
11044 1726853254.65832: sending task result for task 02083763-bbaf-c5a6-f857-0000000003bb
skipping: [managed_node1] => {
    "changed": false,
    "false_condition": "profile_stat.stat.exists",
    "skip_reason": "Conditional result was False"
}
11044 1726853254.66084: no more pending results, returning what we have
11044 1726853254.66087: results queue empty
11044 1726853254.66088: checking for any_errors_fatal
11044 1726853254.66096: done checking for any_errors_fatal
11044 1726853254.66097: checking for max_fail_percentage
11044 1726853254.66098: done checking for max_fail_percentage
11044 1726853254.66099: checking to see if all hosts have failed and the running result is not ok
11044 1726853254.66100: done checking to see if all hosts have failed
11044 1726853254.66101: getting the remaining hosts for this loop
11044 1726853254.66102: done getting the remaining hosts for this loop
11044 1726853254.66106: getting the next task for host managed_node1
11044 1726853254.66112: done getting next task for host managed_node1
11044 1726853254.66115: ^ task is: TASK: Verify the fingerprint comment in ifcfg-{{ profile }}
11044 1726853254.66120: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
11044 1726853254.66125: getting variables
11044 1726853254.66126: in VariableManager get_vars()
11044 1726853254.66169: Calling all_inventory to load vars for managed_node1
11044 1726853254.66173: Calling groups_inventory to load vars for managed_node1
11044 1726853254.66176: Calling all_plugins_inventory to load vars for managed_node1
11044 1726853254.66190: Calling all_plugins_play to load vars for managed_node1
11044 1726853254.66194: Calling groups_plugins_inventory to load vars for managed_node1
11044 1726853254.66196: Calling groups_plugins_play to load vars for managed_node1
11044 1726853254.66784: done sending task result for task 02083763-bbaf-c5a6-f857-0000000003bb
11044 1726853254.66788: WORKER PROCESS EXITING
11044 1726853254.67831: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
11044 1726853254.69344: done with get_vars()
11044 1726853254.69368: done getting variables
11044 1726853254.69432: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)
11044 1726853254.69540: variable 'profile' from source: include params
11044 1726853254.69544: variable 'item' from source: include params
11044 1726853254.69603: variable 'item' from source: include params

TASK [Verify the fingerprint comment in ifcfg-bond0] ***************************
task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:69
Friday 20 September 2024 13:27:34 -0400 (0:00:00.053) 0:00:19.072 ******
11044 1726853254.69636: entering _queue_task() for managed_node1/set_fact
11044 1726853254.69978: worker is 1 (out of 1 available)
11044 1726853254.69992: exiting _queue_task() for managed_node1/set_fact
11044 1726853254.70006: done queuing things up, now waiting for results queue to drain
11044 1726853254.70007: waiting for pending results...
11044 1726853254.70286: running TaskExecutor() for managed_node1/TASK: Verify the fingerprint comment in ifcfg-bond0
11044 1726853254.70416: in run() - task 02083763-bbaf-c5a6-f857-0000000003bc
11044 1726853254.70438: variable 'ansible_search_path' from source: unknown
11044 1726853254.70447: variable 'ansible_search_path' from source: unknown
11044 1726853254.70489: calling self._execute()
11044 1726853254.70592: variable 'ansible_host' from source: host vars for 'managed_node1'
11044 1726853254.70606: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
11044 1726853254.70627: variable 'omit' from source: magic vars
11044 1726853254.70990: variable 'ansible_distribution_major_version' from source: facts
11044 1726853254.71009: Evaluated conditional (ansible_distribution_major_version != '6'): True
11044 1726853254.71136: variable 'profile_stat' from source: set_fact
11044 1726853254.71155: Evaluated conditional (profile_stat.stat.exists): False
11044 1726853254.71167: when evaluation is False, skipping this task
11044 1726853254.71176: _execute() done
11044 1726853254.71183: dumping result to json
11044 1726853254.71190: done dumping result, returning
11044 1726853254.71274: done running TaskExecutor() for managed_node1/TASK: Verify the fingerprint comment in ifcfg-bond0 [02083763-bbaf-c5a6-f857-0000000003bc]
11044 1726853254.71277: sending task result for task 02083763-bbaf-c5a6-f857-0000000003bc
11044 1726853254.71343: done sending task result for task 02083763-bbaf-c5a6-f857-0000000003bc
11044 1726853254.71346: WORKER PROCESS EXITING
skipping: [managed_node1] => {
    "changed": false,
    "false_condition": "profile_stat.stat.exists",
    "skip_reason": "Conditional result was False"
}
11044 1726853254.71417: no more pending results, returning what we have
11044 1726853254.71422: results queue empty
11044 1726853254.71423: checking for any_errors_fatal
11044 1726853254.71432: done checking for any_errors_fatal
11044 1726853254.71433: checking for max_fail_percentage
11044 1726853254.71435: done checking for max_fail_percentage
11044 1726853254.71436: checking to see if all hosts have failed and the running result is not ok
11044 1726853254.71437: done checking to see if all hosts have failed
11044 1726853254.71437: getting the remaining hosts for this loop
11044 1726853254.71438: done getting the remaining hosts for this loop
11044 1726853254.71442: getting the next task for host managed_node1
11044 1726853254.71450: done getting next task for host managed_node1
11044 1726853254.71453: ^ task is: TASK: Assert that the profile is present - '{{ profile }}'
11044 1726853254.71456: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
11044 1726853254.71461: getting variables
11044 1726853254.71463: in VariableManager get_vars()
11044 1726853254.71509: Calling all_inventory to load vars for managed_node1
11044 1726853254.71512: Calling groups_inventory to load vars for managed_node1
11044 1726853254.71515: Calling all_plugins_inventory to load vars for managed_node1
11044 1726853254.71528: Calling all_plugins_play to load vars for managed_node1
11044 1726853254.71531: Calling groups_plugins_inventory to load vars for managed_node1
11044 1726853254.71534: Calling groups_plugins_play to load vars for managed_node1
11044 1726853254.73036: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
11044 1726853254.74549: done with get_vars()
11044 1726853254.74578: done getting variables
11044 1726853254.74642: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)
11044 1726853254.74758: variable 'profile' from source: include params
11044 1726853254.74762: variable 'item' from source: include params
11044 1726853254.74820: variable 'item' from source: include params

TASK [Assert that the profile is present - 'bond0'] ****************************
task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:5
Friday 20 September 2024 13:27:34 -0400 (0:00:00.052) 0:00:19.124 ******
11044 1726853254.74850: entering _queue_task() for managed_node1/assert
11044 1726853254.75386: worker is 1 (out of 1 available)
11044 1726853254.75396: exiting _queue_task() for managed_node1/assert
11044 1726853254.75407: done queuing things up, now waiting for results queue to drain
11044 1726853254.75408: waiting for pending results...
11044 1726853254.75538: running TaskExecutor() for managed_node1/TASK: Assert that the profile is present - 'bond0'
11044 1726853254.75614: in run() - task 02083763-bbaf-c5a6-f857-000000000261
11044 1726853254.75743: variable 'ansible_search_path' from source: unknown
11044 1726853254.75747: variable 'ansible_search_path' from source: unknown
11044 1726853254.75751: calling self._execute()
11044 1726853254.75792: variable 'ansible_host' from source: host vars for 'managed_node1'
11044 1726853254.75804: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
11044 1726853254.75817: variable 'omit' from source: magic vars
11044 1726853254.76477: variable 'ansible_distribution_major_version' from source: facts
11044 1726853254.76498: Evaluated conditional (ansible_distribution_major_version != '6'): True
11044 1726853254.76509: variable 'omit' from source: magic vars
11044 1726853254.76555: variable 'omit' from source: magic vars
11044 1726853254.76670: variable 'profile' from source: include params
11044 1726853254.76683: variable 'item' from source: include params
11044 1726853254.76752: variable 'item' from source: include params
11044 1726853254.76781: variable 'omit' from source: magic vars
11044 1726853254.76830: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection
11044 1726853254.76879: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False)
11044 1726853254.76907: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell
11044 1726853254.76930: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
11044 1726853254.76948: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
11044 1726853254.77077: variable 'inventory_hostname' from source: host vars for 'managed_node1'
11044 1726853254.77081: variable 'ansible_host' from source: host vars for 'managed_node1'
11044 1726853254.77083: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
11044 1726853254.77116: Set connection var ansible_timeout to 10
11044 1726853254.77129: Set connection var ansible_shell_executable to /bin/sh
11044 1726853254.77135: Set connection var ansible_shell_type to sh
11044 1726853254.77145: Set connection var ansible_module_compression to ZIP_DEFLATED
11044 1726853254.77156: Set connection var ansible_connection to ssh
11044 1726853254.77167: Set connection var ansible_pipelining to False
11044 1726853254.77205: variable 'ansible_shell_executable' from source: unknown
11044 1726853254.77214: variable 'ansible_connection' from source: unknown
11044 1726853254.77222: variable 'ansible_module_compression' from source: unknown
11044 1726853254.77229: variable 'ansible_shell_type' from source: unknown
11044 1726853254.77237: variable 'ansible_shell_executable' from source: unknown
11044 1726853254.77244: variable 'ansible_host' from source: host vars for 'managed_node1'
11044 1726853254.77252: variable 'ansible_pipelining' from source: unknown
11044 1726853254.77259: variable 'ansible_timeout' from source: unknown
11044 1726853254.77266: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
11044 1726853254.77408: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False)
11044 1726853254.77428: variable 'omit' from source: magic vars
11044 1726853254.77510: starting attempt loop
11044 1726853254.77513: running the handler
11044 1726853254.77557: variable 'lsr_net_profile_exists' from source: set_fact
11044 1726853254.77566: Evaluated conditional (lsr_net_profile_exists): True
11044 1726853254.77579: handler run complete
11044 1726853254.77600: attempt loop complete, returning result
11044 1726853254.77609: _execute() done
11044 1726853254.77622: dumping result to json
11044 1726853254.77630: done dumping result, returning
11044 1726853254.77643: done running TaskExecutor() for managed_node1/TASK: Assert that the profile is present - 'bond0' [02083763-bbaf-c5a6-f857-000000000261]
11044 1726853254.77652: sending task result for task 02083763-bbaf-c5a6-f857-000000000261
11044 1726853254.77876: done sending task result for task 02083763-bbaf-c5a6-f857-000000000261
11044 1726853254.77880: WORKER PROCESS EXITING
ok: [managed_node1] => {
    "changed": false
}

MSG:

All assertions passed
11044 1726853254.77930: no more pending results, returning what we have
11044 1726853254.77933: results queue empty
11044 1726853254.77935: checking for any_errors_fatal
11044 1726853254.77942: done checking for any_errors_fatal
11044 1726853254.77943: checking for max_fail_percentage
11044 1726853254.77945: done checking for max_fail_percentage
11044 1726853254.77946: checking to see if all hosts have failed and the running result is not ok
11044 1726853254.77947: done checking to see if all hosts have failed
11044 1726853254.77948: getting the remaining hosts for this loop
11044 1726853254.77949: done getting the remaining hosts for this loop
11044 1726853254.77952: getting the next task for host managed_node1
11044 1726853254.77959: done getting next task for host managed_node1
11044 1726853254.77962: ^ task is: TASK: Assert that the ansible managed comment is present in '{{ profile }}'
11044 1726853254.77965: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
11044 1726853254.77969: getting variables
11044 1726853254.77972: in VariableManager get_vars()
11044 1726853254.78020: Calling all_inventory to load vars for managed_node1
11044 1726853254.78023: Calling groups_inventory to load vars for managed_node1
11044 1726853254.78026: Calling all_plugins_inventory to load vars for managed_node1
11044 1726853254.78038: Calling all_plugins_play to load vars for managed_node1
11044 1726853254.78041: Calling groups_plugins_inventory to load vars for managed_node1
11044 1726853254.78044: Calling groups_plugins_play to load vars for managed_node1
11044 1726853254.80681: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
11044 1726853254.82896: done with get_vars()
11044 1726853254.82927: done getting variables
11044 1726853254.82995: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)
11044 1726853254.83121: variable 'profile' from source: include params
11044 1726853254.83126: variable 'item' from source: include params
11044 1726853254.83189: variable 'item' from source: include params

TASK [Assert that the ansible managed comment is present in 'bond0'] ***********
task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:10
Friday 20 September 2024 13:27:34 -0400 (0:00:00.083) 0:00:19.207 ******
11044 1726853254.83228: entering _queue_task() for managed_node1/assert
11044 1726853254.83628: worker is 1 (out of 1 available)
11044 1726853254.83646: exiting _queue_task() for managed_node1/assert
11044 1726853254.83658: done queuing things up, now waiting for results queue to drain
11044 1726853254.83660: waiting for pending results...
11044 1726853254.83807: running TaskExecutor() for managed_node1/TASK: Assert that the ansible managed comment is present in 'bond0'
11044 1726853254.83910: in run() - task 02083763-bbaf-c5a6-f857-000000000262
11044 1726853254.84055: variable 'ansible_search_path' from source: unknown
11044 1726853254.84060: variable 'ansible_search_path' from source: unknown
11044 1726853254.84063: calling self._execute()
11044 1726853254.84066: variable 'ansible_host' from source: host vars for 'managed_node1'
11044 1726853254.84069: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
11044 1726853254.84076: variable 'omit' from source: magic vars
11044 1726853254.84418: variable 'ansible_distribution_major_version' from source: facts
11044 1726853254.84481: Evaluated conditional (ansible_distribution_major_version != '6'): True
11044 1726853254.84486: variable 'omit' from source: magic vars
11044 1726853254.84489: variable 'omit' from source: magic vars
11044 1726853254.84573: variable 'profile' from source: include params
11044 1726853254.84577: variable 'item' from source: include params
11044 1726853254.84700: variable 'item' from source: include params
11044 1726853254.84704: variable 'omit' from source: magic vars
11044 1726853254.84708: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection
11044 1726853254.84732: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False)
11044 1726853254.84753: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell
11044 1726853254.84770: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
11044 1726853254.84813: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
11044 1726853254.84821: variable 'inventory_hostname' from source: host vars for 'managed_node1'
11044 1726853254.84824: variable 'ansible_host' from source: host vars for 'managed_node1'
11044 1726853254.84826: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
11044 1726853254.84898: Set connection var ansible_timeout to 10
11044 1726853254.84904: Set connection var ansible_shell_executable to /bin/sh
11044 1726853254.84907: Set connection var ansible_shell_type to sh
11044 1726853254.84946: Set connection var ansible_module_compression to ZIP_DEFLATED
11044 1726853254.84951: Set connection var ansible_connection to ssh
11044 1726853254.84954: Set connection var ansible_pipelining to False
11044 1726853254.84957: variable 'ansible_shell_executable' from source: unknown
11044 1726853254.84959: variable 'ansible_connection' from source: unknown
11044 1726853254.84962: variable 'ansible_module_compression' from source: unknown
11044 1726853254.84964: variable 'ansible_shell_type' from source: unknown
11044 1726853254.84966: variable 'ansible_shell_executable' from source: unknown
11044 1726853254.84968: variable 'ansible_host' from source: host vars for 'managed_node1'
11044 1726853254.84970: variable 'ansible_pipelining' from source: unknown
11044 1726853254.84974: variable 'ansible_timeout' from source: unknown
11044 1726853254.84976: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
11044 1726853254.85088: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11044 1726853254.85092: variable 'omit' from source: magic vars 11044 1726853254.85112: starting attempt loop 11044 1726853254.85115: running the handler 11044 1726853254.85230: variable 'lsr_net_profile_ansible_managed' from source: set_fact 11044 1726853254.85234: Evaluated conditional (lsr_net_profile_ansible_managed): True 11044 1726853254.85237: handler run complete 11044 1726853254.85240: attempt loop complete, returning result 11044 1726853254.85242: _execute() done 11044 1726853254.85247: dumping result to json 11044 1726853254.85249: done dumping result, returning 11044 1726853254.85251: done running TaskExecutor() for managed_node1/TASK: Assert that the ansible managed comment is present in 'bond0' [02083763-bbaf-c5a6-f857-000000000262] 11044 1726853254.85254: sending task result for task 02083763-bbaf-c5a6-f857-000000000262 11044 1726853254.85474: done sending task result for task 02083763-bbaf-c5a6-f857-000000000262 11044 1726853254.85477: WORKER PROCESS EXITING ok: [managed_node1] => { "changed": false } MSG: All assertions passed 11044 1726853254.85528: no more pending results, returning what we have 11044 1726853254.85531: results queue empty 11044 1726853254.85533: checking for any_errors_fatal 11044 1726853254.85538: done checking for any_errors_fatal 11044 1726853254.85538: checking for max_fail_percentage 11044 1726853254.85540: done checking for max_fail_percentage 11044 1726853254.85541: checking to see if all hosts have failed and the running result is not ok 11044 1726853254.85542: done checking to see if all hosts have failed 11044 1726853254.85543: getting the remaining hosts for this loop 11044 1726853254.85547: done getting the remaining hosts for this loop 11044 1726853254.85551: getting the next task for host managed_node1 11044 1726853254.85557: done getting 
next task for host managed_node1 11044 1726853254.85560: ^ task is: TASK: Assert that the fingerprint comment is present in {{ profile }} 11044 1726853254.85563: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11044 1726853254.85567: getting variables 11044 1726853254.85579: in VariableManager get_vars() 11044 1726853254.85617: Calling all_inventory to load vars for managed_node1 11044 1726853254.85619: Calling groups_inventory to load vars for managed_node1 11044 1726853254.85622: Calling all_plugins_inventory to load vars for managed_node1 11044 1726853254.85631: Calling all_plugins_play to load vars for managed_node1 11044 1726853254.85634: Calling groups_plugins_inventory to load vars for managed_node1 11044 1726853254.85636: Calling groups_plugins_play to load vars for managed_node1 11044 1726853254.86747: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11044 1726853254.87697: done with get_vars() 11044 1726853254.87714: done getting variables 11044 1726853254.87762: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 11044 1726853254.87849: variable 'profile' from source: include params 11044 1726853254.87853: variable 'item' from 
source: include params 11044 1726853254.87896: variable 'item' from source: include params TASK [Assert that the fingerprint comment is present in bond0] ***************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:15 Friday 20 September 2024 13:27:34 -0400 (0:00:00.047) 0:00:19.255 ****** 11044 1726853254.87924: entering _queue_task() for managed_node1/assert 11044 1726853254.88194: worker is 1 (out of 1 available) 11044 1726853254.88207: exiting _queue_task() for managed_node1/assert 11044 1726853254.88223: done queuing things up, now waiting for results queue to drain 11044 1726853254.88225: waiting for pending results... 11044 1726853254.88515: running TaskExecutor() for managed_node1/TASK: Assert that the fingerprint comment is present in bond0 11044 1726853254.88688: in run() - task 02083763-bbaf-c5a6-f857-000000000263 11044 1726853254.88693: variable 'ansible_search_path' from source: unknown 11044 1726853254.88695: variable 'ansible_search_path' from source: unknown 11044 1726853254.88700: calling self._execute() 11044 1726853254.88780: variable 'ansible_host' from source: host vars for 'managed_node1' 11044 1726853254.88791: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11044 1726853254.88805: variable 'omit' from source: magic vars 11044 1726853254.89254: variable 'ansible_distribution_major_version' from source: facts 11044 1726853254.89269: Evaluated conditional (ansible_distribution_major_version != '6'): True 11044 1726853254.89278: variable 'omit' from source: magic vars 11044 1726853254.89322: variable 'omit' from source: magic vars 11044 1726853254.89414: variable 'profile' from source: include params 11044 1726853254.89417: variable 'item' from source: include params 11044 1726853254.89466: variable 'item' from source: include params 11044 1726853254.89486: variable 'omit' from source: magic vars 11044 1726853254.89519: trying 
/usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11044 1726853254.89545: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11044 1726853254.89564: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11044 1726853254.89580: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11044 1726853254.89591: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11044 1726853254.89623: variable 'inventory_hostname' from source: host vars for 'managed_node1' 11044 1726853254.89627: variable 'ansible_host' from source: host vars for 'managed_node1' 11044 1726853254.89629: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11044 1726853254.89714: Set connection var ansible_timeout to 10 11044 1726853254.89719: Set connection var ansible_shell_executable to /bin/sh 11044 1726853254.89721: Set connection var ansible_shell_type to sh 11044 1726853254.89724: Set connection var ansible_module_compression to ZIP_DEFLATED 11044 1726853254.89726: Set connection var ansible_connection to ssh 11044 1726853254.89728: Set connection var ansible_pipelining to False 11044 1726853254.89746: variable 'ansible_shell_executable' from source: unknown 11044 1726853254.89751: variable 'ansible_connection' from source: unknown 11044 1726853254.89754: variable 'ansible_module_compression' from source: unknown 11044 1726853254.89756: variable 'ansible_shell_type' from source: unknown 11044 1726853254.89758: variable 'ansible_shell_executable' from source: unknown 11044 1726853254.89762: variable 'ansible_host' from source: host vars for 'managed_node1' 11044 1726853254.89766: variable 'ansible_pipelining' from source: unknown 11044 1726853254.89769: variable 'ansible_timeout' from 
source: unknown 11044 1726853254.89773: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11044 1726853254.89878: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11044 1726853254.89887: variable 'omit' from source: magic vars 11044 1726853254.89892: starting attempt loop 11044 1726853254.89895: running the handler 11044 1726853254.89975: variable 'lsr_net_profile_fingerprint' from source: set_fact 11044 1726853254.89979: Evaluated conditional (lsr_net_profile_fingerprint): True 11044 1726853254.89985: handler run complete 11044 1726853254.89996: attempt loop complete, returning result 11044 1726853254.89999: _execute() done 11044 1726853254.90002: dumping result to json 11044 1726853254.90004: done dumping result, returning 11044 1726853254.90010: done running TaskExecutor() for managed_node1/TASK: Assert that the fingerprint comment is present in bond0 [02083763-bbaf-c5a6-f857-000000000263] 11044 1726853254.90013: sending task result for task 02083763-bbaf-c5a6-f857-000000000263 11044 1726853254.90099: done sending task result for task 02083763-bbaf-c5a6-f857-000000000263 11044 1726853254.90102: WORKER PROCESS EXITING ok: [managed_node1] => { "changed": false } MSG: All assertions passed 11044 1726853254.90150: no more pending results, returning what we have 11044 1726853254.90153: results queue empty 11044 1726853254.90154: checking for any_errors_fatal 11044 1726853254.90161: done checking for any_errors_fatal 11044 1726853254.90161: checking for max_fail_percentage 11044 1726853254.90163: done checking for max_fail_percentage 11044 1726853254.90164: checking to see if all hosts have failed and the running result is not ok 11044 1726853254.90165: done checking to see if all hosts have 
failed 11044 1726853254.90166: getting the remaining hosts for this loop 11044 1726853254.90167: done getting the remaining hosts for this loop 11044 1726853254.90172: getting the next task for host managed_node1 11044 1726853254.90181: done getting next task for host managed_node1 11044 1726853254.90184: ^ task is: TASK: Include the task 'get_profile_stat.yml' 11044 1726853254.90187: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11044 1726853254.90190: getting variables 11044 1726853254.90191: in VariableManager get_vars() 11044 1726853254.90235: Calling all_inventory to load vars for managed_node1 11044 1726853254.90237: Calling groups_inventory to load vars for managed_node1 11044 1726853254.90240: Calling all_plugins_inventory to load vars for managed_node1 11044 1726853254.90250: Calling all_plugins_play to load vars for managed_node1 11044 1726853254.90252: Calling groups_plugins_inventory to load vars for managed_node1 11044 1726853254.90255: Calling groups_plugins_play to load vars for managed_node1 11044 1726853254.91027: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11044 1726853254.91874: done with get_vars() 11044 1726853254.91890: done getting variables TASK [Include the task 'get_profile_stat.yml'] ********************************* task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:3 
Friday 20 September 2024 13:27:34 -0400 (0:00:00.040) 0:00:19.295 ****** 11044 1726853254.91957: entering _queue_task() for managed_node1/include_tasks 11044 1726853254.92207: worker is 1 (out of 1 available) 11044 1726853254.92223: exiting _queue_task() for managed_node1/include_tasks 11044 1726853254.92236: done queuing things up, now waiting for results queue to drain 11044 1726853254.92238: waiting for pending results... 11044 1726853254.92412: running TaskExecutor() for managed_node1/TASK: Include the task 'get_profile_stat.yml' 11044 1726853254.92480: in run() - task 02083763-bbaf-c5a6-f857-000000000267 11044 1726853254.92492: variable 'ansible_search_path' from source: unknown 11044 1726853254.92496: variable 'ansible_search_path' from source: unknown 11044 1726853254.92521: calling self._execute() 11044 1726853254.92598: variable 'ansible_host' from source: host vars for 'managed_node1' 11044 1726853254.92603: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11044 1726853254.92612: variable 'omit' from source: magic vars 11044 1726853254.92886: variable 'ansible_distribution_major_version' from source: facts 11044 1726853254.92899: Evaluated conditional (ansible_distribution_major_version != '6'): True 11044 1726853254.92908: _execute() done 11044 1726853254.92911: dumping result to json 11044 1726853254.92914: done dumping result, returning 11044 1726853254.92917: done running TaskExecutor() for managed_node1/TASK: Include the task 'get_profile_stat.yml' [02083763-bbaf-c5a6-f857-000000000267] 11044 1726853254.92920: sending task result for task 02083763-bbaf-c5a6-f857-000000000267 11044 1726853254.93000: done sending task result for task 02083763-bbaf-c5a6-f857-000000000267 11044 1726853254.93003: WORKER PROCESS EXITING 11044 1726853254.93033: no more pending results, returning what we have 11044 1726853254.93037: in VariableManager get_vars() 11044 1726853254.93085: Calling all_inventory to load vars for managed_node1 11044 
1726853254.93089: Calling groups_inventory to load vars for managed_node1 11044 1726853254.93091: Calling all_plugins_inventory to load vars for managed_node1 11044 1726853254.93103: Calling all_plugins_play to load vars for managed_node1 11044 1726853254.93106: Calling groups_plugins_inventory to load vars for managed_node1 11044 1726853254.93108: Calling groups_plugins_play to load vars for managed_node1 11044 1726853254.93986: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11044 1726853254.94827: done with get_vars() 11044 1726853254.94841: variable 'ansible_search_path' from source: unknown 11044 1726853254.94842: variable 'ansible_search_path' from source: unknown 11044 1726853254.94868: we have included files to process 11044 1726853254.94869: generating all_blocks data 11044 1726853254.94872: done generating all_blocks data 11044 1726853254.94876: processing included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 11044 1726853254.94877: loading included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 11044 1726853254.94878: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 11044 1726853254.95469: done processing included file 11044 1726853254.95472: iterating over new_blocks loaded from include file 11044 1726853254.95473: in VariableManager get_vars() 11044 1726853254.95487: done with get_vars() 11044 1726853254.95488: filtering new block on tags 11044 1726853254.95503: done filtering new block on tags 11044 1726853254.95505: in VariableManager get_vars() 11044 1726853254.95515: done with get_vars() 11044 1726853254.95516: filtering new block on tags 11044 1726853254.95529: done filtering new block on tags 11044 1726853254.95530: done iterating over new_blocks 
loaded from include file included: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml for managed_node1 11044 1726853254.95534: extending task lists for all hosts with included blocks 11044 1726853254.95635: done extending task lists 11044 1726853254.95636: done processing included files 11044 1726853254.95636: results queue empty 11044 1726853254.95637: checking for any_errors_fatal 11044 1726853254.95639: done checking for any_errors_fatal 11044 1726853254.95639: checking for max_fail_percentage 11044 1726853254.95640: done checking for max_fail_percentage 11044 1726853254.95641: checking to see if all hosts have failed and the running result is not ok 11044 1726853254.95641: done checking to see if all hosts have failed 11044 1726853254.95642: getting the remaining hosts for this loop 11044 1726853254.95642: done getting the remaining hosts for this loop 11044 1726853254.95644: getting the next task for host managed_node1 11044 1726853254.95647: done getting next task for host managed_node1 11044 1726853254.95648: ^ task is: TASK: Initialize NM profile exist and ansible_managed comment flag 11044 1726853254.95650: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False 11044 1726853254.95652: getting variables 11044 1726853254.95652: in VariableManager get_vars() 11044 1726853254.95661: Calling all_inventory to load vars for managed_node1 11044 1726853254.95663: Calling groups_inventory to load vars for managed_node1 11044 1726853254.95664: Calling all_plugins_inventory to load vars for managed_node1 11044 1726853254.95669: Calling all_plugins_play to load vars for managed_node1 11044 1726853254.95673: Calling groups_plugins_inventory to load vars for managed_node1 11044 1726853254.95675: Calling groups_plugins_play to load vars for managed_node1 11044 1726853254.96316: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11044 1726853254.97153: done with get_vars() 11044 1726853254.97168: done getting variables 11044 1726853254.97202: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Initialize NM profile exist and ansible_managed comment flag] ************ task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:3 Friday 20 September 2024 13:27:34 -0400 (0:00:00.052) 0:00:19.348 ****** 11044 1726853254.97223: entering _queue_task() for managed_node1/set_fact 11044 1726853254.97489: worker is 1 (out of 1 available) 11044 1726853254.97503: exiting _queue_task() for managed_node1/set_fact 11044 1726853254.97517: done queuing things up, now waiting for results queue to drain 11044 1726853254.97518: waiting for pending results... 
11044 1726853254.97703: running TaskExecutor() for managed_node1/TASK: Initialize NM profile exist and ansible_managed comment flag 11044 1726853254.97783: in run() - task 02083763-bbaf-c5a6-f857-0000000003fb 11044 1726853254.97795: variable 'ansible_search_path' from source: unknown 11044 1726853254.97799: variable 'ansible_search_path' from source: unknown 11044 1726853254.97828: calling self._execute() 11044 1726853254.97903: variable 'ansible_host' from source: host vars for 'managed_node1' 11044 1726853254.97910: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11044 1726853254.97918: variable 'omit' from source: magic vars 11044 1726853254.98193: variable 'ansible_distribution_major_version' from source: facts 11044 1726853254.98204: Evaluated conditional (ansible_distribution_major_version != '6'): True 11044 1726853254.98210: variable 'omit' from source: magic vars 11044 1726853254.98240: variable 'omit' from source: magic vars 11044 1726853254.98268: variable 'omit' from source: magic vars 11044 1726853254.98304: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11044 1726853254.98331: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11044 1726853254.98350: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11044 1726853254.98364: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11044 1726853254.98376: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11044 1726853254.98402: variable 'inventory_hostname' from source: host vars for 'managed_node1' 11044 1726853254.98406: variable 'ansible_host' from source: host vars for 'managed_node1' 11044 1726853254.98408: variable 'ansible_ssh_extra_args' from source: host vars for 
'managed_node1' 11044 1726853254.98478: Set connection var ansible_timeout to 10 11044 1726853254.98485: Set connection var ansible_shell_executable to /bin/sh 11044 1726853254.98487: Set connection var ansible_shell_type to sh 11044 1726853254.98493: Set connection var ansible_module_compression to ZIP_DEFLATED 11044 1726853254.98499: Set connection var ansible_connection to ssh 11044 1726853254.98501: Set connection var ansible_pipelining to False 11044 1726853254.98522: variable 'ansible_shell_executable' from source: unknown 11044 1726853254.98525: variable 'ansible_connection' from source: unknown 11044 1726853254.98528: variable 'ansible_module_compression' from source: unknown 11044 1726853254.98530: variable 'ansible_shell_type' from source: unknown 11044 1726853254.98533: variable 'ansible_shell_executable' from source: unknown 11044 1726853254.98535: variable 'ansible_host' from source: host vars for 'managed_node1' 11044 1726853254.98538: variable 'ansible_pipelining' from source: unknown 11044 1726853254.98540: variable 'ansible_timeout' from source: unknown 11044 1726853254.98543: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11044 1726853254.98646: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11044 1726853254.98656: variable 'omit' from source: magic vars 11044 1726853254.98661: starting attempt loop 11044 1726853254.98664: running the handler 11044 1726853254.98676: handler run complete 11044 1726853254.98684: attempt loop complete, returning result 11044 1726853254.98687: _execute() done 11044 1726853254.98689: dumping result to json 11044 1726853254.98691: done dumping result, returning 11044 1726853254.98697: done running TaskExecutor() for 
managed_node1/TASK: Initialize NM profile exist and ansible_managed comment flag [02083763-bbaf-c5a6-f857-0000000003fb] 11044 1726853254.98702: sending task result for task 02083763-bbaf-c5a6-f857-0000000003fb 11044 1726853254.98779: done sending task result for task 02083763-bbaf-c5a6-f857-0000000003fb 11044 1726853254.98782: WORKER PROCESS EXITING ok: [managed_node1] => { "ansible_facts": { "lsr_net_profile_ansible_managed": false, "lsr_net_profile_exists": false, "lsr_net_profile_fingerprint": false }, "changed": false } 11044 1726853254.98831: no more pending results, returning what we have 11044 1726853254.98834: results queue empty 11044 1726853254.98835: checking for any_errors_fatal 11044 1726853254.98837: done checking for any_errors_fatal 11044 1726853254.98838: checking for max_fail_percentage 11044 1726853254.98840: done checking for max_fail_percentage 11044 1726853254.98840: checking to see if all hosts have failed and the running result is not ok 11044 1726853254.98841: done checking to see if all hosts have failed 11044 1726853254.98842: getting the remaining hosts for this loop 11044 1726853254.98843: done getting the remaining hosts for this loop 11044 1726853254.98846: getting the next task for host managed_node1 11044 1726853254.98853: done getting next task for host managed_node1 11044 1726853254.98855: ^ task is: TASK: Stat profile file 11044 1726853254.98859: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11044 1726853254.98863: getting variables 11044 1726853254.98864: in VariableManager get_vars() 11044 1726853254.98905: Calling all_inventory to load vars for managed_node1 11044 1726853254.98908: Calling groups_inventory to load vars for managed_node1 11044 1726853254.98911: Calling all_plugins_inventory to load vars for managed_node1 11044 1726853254.98921: Calling all_plugins_play to load vars for managed_node1 11044 1726853254.98923: Calling groups_plugins_inventory to load vars for managed_node1 11044 1726853254.98925: Calling groups_plugins_play to load vars for managed_node1 11044 1726853254.99742: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11044 1726853255.00589: done with get_vars() 11044 1726853255.00605: done getting variables TASK [Stat profile file] ******************************************************* task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:9 Friday 20 September 2024 13:27:35 -0400 (0:00:00.034) 0:00:19.382 ****** 11044 1726853255.00669: entering _queue_task() for managed_node1/stat 11044 1726853255.00912: worker is 1 (out of 1 available) 11044 1726853255.00927: exiting _queue_task() for managed_node1/stat 11044 1726853255.00940: done queuing things up, now waiting for results queue to drain 11044 1726853255.00941: waiting for pending results... 
11044 1726853255.01118: running TaskExecutor() for managed_node1/TASK: Stat profile file 11044 1726853255.01192: in run() - task 02083763-bbaf-c5a6-f857-0000000003fc 11044 1726853255.01203: variable 'ansible_search_path' from source: unknown 11044 1726853255.01207: variable 'ansible_search_path' from source: unknown 11044 1726853255.01234: calling self._execute() 11044 1726853255.01310: variable 'ansible_host' from source: host vars for 'managed_node1' 11044 1726853255.01314: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11044 1726853255.01322: variable 'omit' from source: magic vars 11044 1726853255.01599: variable 'ansible_distribution_major_version' from source: facts 11044 1726853255.01610: Evaluated conditional (ansible_distribution_major_version != '6'): True 11044 1726853255.01613: variable 'omit' from source: magic vars 11044 1726853255.01645: variable 'omit' from source: magic vars 11044 1726853255.01716: variable 'profile' from source: include params 11044 1726853255.01720: variable 'item' from source: include params 11044 1726853255.01765: variable 'item' from source: include params 11044 1726853255.01781: variable 'omit' from source: magic vars 11044 1726853255.01814: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11044 1726853255.01844: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11044 1726853255.01863: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11044 1726853255.01877: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11044 1726853255.01888: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11044 1726853255.01912: variable 'inventory_hostname' from source: host vars for 'managed_node1' 11044 
1726853255.01915: variable 'ansible_host' from source: host vars for 'managed_node1' 11044 1726853255.01918: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11044 1726853255.01991: Set connection var ansible_timeout to 10 11044 1726853255.01998: Set connection var ansible_shell_executable to /bin/sh 11044 1726853255.02001: Set connection var ansible_shell_type to sh 11044 1726853255.02005: Set connection var ansible_module_compression to ZIP_DEFLATED 11044 1726853255.02010: Set connection var ansible_connection to ssh 11044 1726853255.02015: Set connection var ansible_pipelining to False 11044 1726853255.02034: variable 'ansible_shell_executable' from source: unknown 11044 1726853255.02038: variable 'ansible_connection' from source: unknown 11044 1726853255.02040: variable 'ansible_module_compression' from source: unknown 11044 1726853255.02043: variable 'ansible_shell_type' from source: unknown 11044 1726853255.02046: variable 'ansible_shell_executable' from source: unknown 11044 1726853255.02049: variable 'ansible_host' from source: host vars for 'managed_node1' 11044 1726853255.02058: variable 'ansible_pipelining' from source: unknown 11044 1726853255.02060: variable 'ansible_timeout' from source: unknown 11044 1726853255.02062: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11044 1726853255.02206: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 11044 1726853255.02213: variable 'omit' from source: magic vars 11044 1726853255.02219: starting attempt loop 11044 1726853255.02222: running the handler 11044 1726853255.02234: _low_level_execute_command(): starting 11044 1726853255.02241: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11044 1726853255.02738: stderr chunk 
(state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11044 1726853255.02766: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11044 1726853255.02770: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11044 1726853255.02776: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11044 1726853255.02829: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 11044 1726853255.02832: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11044 1726853255.02834: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11044 1726853255.02892: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11044 1726853255.04681: stdout chunk (state=3): >>>/root <<< 11044 1726853255.04780: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11044 1726853255.04813: stderr chunk (state=3): >>><<< 11044 1726853255.04816: stdout chunk (state=3): >>><<< 11044 1726853255.04839: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11044 1726853255.04853: _low_level_execute_command(): starting 11044 1726853255.04859: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853255.0483835-12050-148174343617250 `" && echo ansible-tmp-1726853255.0483835-12050-148174343617250="` echo /root/.ansible/tmp/ansible-tmp-1726853255.0483835-12050-148174343617250 `" ) && sleep 0' 11044 1726853255.05319: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11044 1726853255.05324: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found <<< 11044 1726853255.05333: stderr chunk (state=3): >>>debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11044 1726853255.05336: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11044 1726853255.05338: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11044 1726853255.05383: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 11044 1726853255.05390: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11044 1726853255.05393: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11044 1726853255.05433: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11044 1726853255.07326: stdout chunk (state=3): >>>ansible-tmp-1726853255.0483835-12050-148174343617250=/root/.ansible/tmp/ansible-tmp-1726853255.0483835-12050-148174343617250 <<< 11044 1726853255.07426: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11044 1726853255.07455: stderr chunk (state=3): >>><<< 11044 1726853255.07458: stdout chunk (state=3): >>><<< 11044 1726853255.07476: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853255.0483835-12050-148174343617250=/root/.ansible/tmp/ansible-tmp-1726853255.0483835-12050-148174343617250 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 
originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11044 1726853255.07519: variable 'ansible_module_compression' from source: unknown 11044 1726853255.07562: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-1104467doc9gy/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 11044 1726853255.07595: variable 'ansible_facts' from source: unknown 11044 1726853255.07657: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853255.0483835-12050-148174343617250/AnsiballZ_stat.py 11044 1726853255.07760: Sending initial data 11044 1726853255.07763: Sent initial data (153 bytes) 11044 1726853255.08222: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11044 1726853255.08226: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found <<< 11044 1726853255.08228: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 11044 1726853255.08230: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11044 1726853255.08232: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11044 1726853255.08284: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 11044 1726853255.08287: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11044 1726853255.08332: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11044 1726853255.09864: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 <<< 11044 1726853255.09868: stderr chunk (state=3): >>>debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 11044 1726853255.09918: stderr chunk (state=3): >>>debug2: Sending 
SSH2_FXP_REALPATH "." <<< 11044 1726853255.09959: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-1104467doc9gy/tmp82krph0z /root/.ansible/tmp/ansible-tmp-1726853255.0483835-12050-148174343617250/AnsiballZ_stat.py <<< 11044 1726853255.09964: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853255.0483835-12050-148174343617250/AnsiballZ_stat.py" <<< 11044 1726853255.10003: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-1104467doc9gy/tmp82krph0z" to remote "/root/.ansible/tmp/ansible-tmp-1726853255.0483835-12050-148174343617250/AnsiballZ_stat.py" <<< 11044 1726853255.10005: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853255.0483835-12050-148174343617250/AnsiballZ_stat.py" <<< 11044 1726853255.10529: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11044 1726853255.10575: stderr chunk (state=3): >>><<< 11044 1726853255.10578: stdout chunk (state=3): >>><<< 11044 1726853255.10609: done transferring module to remote 11044 1726853255.10620: _low_level_execute_command(): starting 11044 1726853255.10628: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853255.0483835-12050-148174343617250/ /root/.ansible/tmp/ansible-tmp-1726853255.0483835-12050-148174343617250/AnsiballZ_stat.py && sleep 0' 11044 1726853255.11086: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11044 1726853255.11089: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 <<< 11044 1726853255.11091: stderr chunk (state=3): >>>debug2: match not found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11044 1726853255.11093: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11044 1726853255.11099: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11044 1726853255.11146: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 11044 1726853255.11149: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11044 1726853255.11151: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11044 1726853255.11197: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11044 1726853255.12935: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11044 1726853255.12962: stderr chunk (state=3): >>><<< 11044 1726853255.12965: stdout chunk (state=3): >>><<< 11044 1726853255.12981: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11044 1726853255.12984: _low_level_execute_command(): starting 11044 1726853255.12990: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853255.0483835-12050-148174343617250/AnsiballZ_stat.py && sleep 0' 11044 1726853255.13437: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11044 1726853255.13440: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found <<< 11044 1726853255.13447: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 11044 1726853255.13450: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11044 1726853255.13452: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 11044 1726853255.13501: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 11044 1726853255.13504: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11044 1726853255.13510: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11044 1726853255.13555: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11044 1726853255.29222: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-bond0.0", "follow": false, "checksum_algorithm": "sha1"}}} <<< 11044 1726853255.31145: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.45.153 closed. <<< 11044 1726853255.31176: stderr chunk (state=3): >>><<< 11044 1726853255.31179: stdout chunk (state=3): >>><<< 11044 1726853255.31195: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-bond0.0", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.45.153 closed. 11044 1726853255.31221: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/etc/sysconfig/network-scripts/ifcfg-bond0.0', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853255.0483835-12050-148174343617250/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11044 1726853255.31229: _low_level_execute_command(): starting 11044 1726853255.31235: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853255.0483835-12050-148174343617250/ > /dev/null 2>&1 && sleep 0' 11044 1726853255.31704: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11044 1726853255.31707: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11044 1726853255.31709: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11044 1726853255.31717: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11044 1726853255.31778: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 11044 1726853255.31782: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11044 1726853255.31785: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11044 1726853255.31854: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11044 1726853255.33964: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11044 1726853255.33991: stderr chunk (state=3): >>><<< 11044 1726853255.33994: stdout chunk (state=3): >>><<< 11044 1726853255.34009: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration 
requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11044 1726853255.34015: handler run complete 11044 1726853255.34033: attempt loop complete, returning result 11044 1726853255.34036: _execute() done 11044 1726853255.34038: dumping result to json 11044 1726853255.34040: done dumping result, returning 11044 1726853255.34051: done running TaskExecutor() for managed_node1/TASK: Stat profile file [02083763-bbaf-c5a6-f857-0000000003fc] 11044 1726853255.34054: sending task result for task 02083763-bbaf-c5a6-f857-0000000003fc 11044 1726853255.34152: done sending task result for task 02083763-bbaf-c5a6-f857-0000000003fc 11044 1726853255.34157: WORKER PROCESS EXITING ok: [managed_node1] => { "changed": false, "stat": { "exists": false } } 11044 1726853255.34214: no more pending results, returning what we have 11044 1726853255.34216: results queue empty 11044 1726853255.34218: checking for any_errors_fatal 11044 1726853255.34226: done checking for any_errors_fatal 11044 1726853255.34227: checking for max_fail_percentage 11044 1726853255.34228: done checking for max_fail_percentage 11044 1726853255.34229: checking to see if all hosts have failed and the running result is not ok 11044 1726853255.34230: done checking to see if all hosts have failed 11044 1726853255.34231: getting the remaining hosts 
for this loop 11044 1726853255.34232: done getting the remaining hosts for this loop 11044 1726853255.34235: getting the next task for host managed_node1 11044 1726853255.34241: done getting next task for host managed_node1 11044 1726853255.34246: ^ task is: TASK: Set NM profile exist flag based on the profile files 11044 1726853255.34250: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11044 1726853255.34253: getting variables 11044 1726853255.34255: in VariableManager get_vars() 11044 1726853255.34304: Calling all_inventory to load vars for managed_node1 11044 1726853255.34307: Calling groups_inventory to load vars for managed_node1 11044 1726853255.34309: Calling all_plugins_inventory to load vars for managed_node1 11044 1726853255.34320: Calling all_plugins_play to load vars for managed_node1 11044 1726853255.34323: Calling groups_plugins_inventory to load vars for managed_node1 11044 1726853255.34325: Calling groups_plugins_play to load vars for managed_node1 11044 1726853255.35676: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11044 1726853255.42488: done with get_vars() 11044 1726853255.42518: done getting variables 11044 1726853255.42556: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set NM profile exist flag based on the profile files] ******************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:17 Friday 20 September 2024 13:27:35 -0400 (0:00:00.419) 0:00:19.801 ****** 11044 1726853255.42577: entering _queue_task() for managed_node1/set_fact 11044 1726853255.42849: worker is 1 (out of 1 available) 11044 1726853255.42862: exiting _queue_task() for managed_node1/set_fact 11044 1726853255.42877: done queuing things up, now waiting for results queue to drain 11044 1726853255.42878: waiting for pending results... 
11044 1726853255.43053: running TaskExecutor() for managed_node1/TASK: Set NM profile exist flag based on the profile files 11044 1726853255.43134: in run() - task 02083763-bbaf-c5a6-f857-0000000003fd 11044 1726853255.43147: variable 'ansible_search_path' from source: unknown 11044 1726853255.43151: variable 'ansible_search_path' from source: unknown 11044 1726853255.43180: calling self._execute() 11044 1726853255.43253: variable 'ansible_host' from source: host vars for 'managed_node1' 11044 1726853255.43257: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11044 1726853255.43266: variable 'omit' from source: magic vars 11044 1726853255.43553: variable 'ansible_distribution_major_version' from source: facts 11044 1726853255.43563: Evaluated conditional (ansible_distribution_major_version != '6'): True 11044 1726853255.43650: variable 'profile_stat' from source: set_fact 11044 1726853255.43660: Evaluated conditional (profile_stat.stat.exists): False 11044 1726853255.43664: when evaluation is False, skipping this task 11044 1726853255.43668: _execute() done 11044 1726853255.43673: dumping result to json 11044 1726853255.43676: done dumping result, returning 11044 1726853255.43679: done running TaskExecutor() for managed_node1/TASK: Set NM profile exist flag based on the profile files [02083763-bbaf-c5a6-f857-0000000003fd] 11044 1726853255.43682: sending task result for task 02083763-bbaf-c5a6-f857-0000000003fd 11044 1726853255.43773: done sending task result for task 02083763-bbaf-c5a6-f857-0000000003fd 11044 1726853255.43776: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 11044 1726853255.43824: no more pending results, returning what we have 11044 1726853255.43827: results queue empty 11044 1726853255.43828: checking for any_errors_fatal 11044 1726853255.43837: done checking for any_errors_fatal 11044 1726853255.43838: 
checking for max_fail_percentage 11044 1726853255.43839: done checking for max_fail_percentage 11044 1726853255.43840: checking to see if all hosts have failed and the running result is not ok 11044 1726853255.43841: done checking to see if all hosts have failed 11044 1726853255.43841: getting the remaining hosts for this loop 11044 1726853255.43843: done getting the remaining hosts for this loop 11044 1726853255.43848: getting the next task for host managed_node1 11044 1726853255.43855: done getting next task for host managed_node1 11044 1726853255.43857: ^ task is: TASK: Get NM profile info 11044 1726853255.43861: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11044 1726853255.43867: getting variables 11044 1726853255.43868: in VariableManager get_vars() 11044 1726853255.43911: Calling all_inventory to load vars for managed_node1 11044 1726853255.43914: Calling groups_inventory to load vars for managed_node1 11044 1726853255.43916: Calling all_plugins_inventory to load vars for managed_node1 11044 1726853255.43926: Calling all_plugins_play to load vars for managed_node1 11044 1726853255.43929: Calling groups_plugins_inventory to load vars for managed_node1 11044 1726853255.43931: Calling groups_plugins_play to load vars for managed_node1 11044 1726853255.45068: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11044 1726853255.46096: done with get_vars() 11044 1726853255.46115: done getting variables 11044 1726853255.46158: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Get NM profile info] ***************************************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:25 Friday 20 September 2024 13:27:35 -0400 (0:00:00.036) 0:00:19.837 ****** 11044 1726853255.46184: entering _queue_task() for managed_node1/shell 11044 1726853255.46437: worker is 1 (out of 1 available) 11044 1726853255.46449: exiting _queue_task() for managed_node1/shell 11044 1726853255.46462: done queuing things up, now waiting for results queue to drain 11044 1726853255.46463: waiting for pending results... 
11044 1726853255.46644: running TaskExecutor() for managed_node1/TASK: Get NM profile info 11044 1726853255.46731: in run() - task 02083763-bbaf-c5a6-f857-0000000003fe 11044 1726853255.46743: variable 'ansible_search_path' from source: unknown 11044 1726853255.46746: variable 'ansible_search_path' from source: unknown 11044 1726853255.46780: calling self._execute() 11044 1726853255.46855: variable 'ansible_host' from source: host vars for 'managed_node1' 11044 1726853255.46859: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11044 1726853255.46868: variable 'omit' from source: magic vars 11044 1726853255.47162: variable 'ansible_distribution_major_version' from source: facts 11044 1726853255.47173: Evaluated conditional (ansible_distribution_major_version != '6'): True 11044 1726853255.47180: variable 'omit' from source: magic vars 11044 1726853255.47209: variable 'omit' from source: magic vars 11044 1726853255.47301: variable 'profile' from source: include params 11044 1726853255.47305: variable 'item' from source: include params 11044 1726853255.47382: variable 'item' from source: include params 11044 1726853255.47438: variable 'omit' from source: magic vars 11044 1726853255.47442: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11044 1726853255.47468: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11044 1726853255.47503: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11044 1726853255.47516: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11044 1726853255.47676: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11044 1726853255.47680: variable 'inventory_hostname' from source: host vars for 'managed_node1' 11044 
1726853255.47683: variable 'ansible_host' from source: host vars for 'managed_node1' 11044 1726853255.47685: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11044 1726853255.47687: Set connection var ansible_timeout to 10 11044 1726853255.47689: Set connection var ansible_shell_executable to /bin/sh 11044 1726853255.47691: Set connection var ansible_shell_type to sh 11044 1726853255.47693: Set connection var ansible_module_compression to ZIP_DEFLATED 11044 1726853255.47707: Set connection var ansible_connection to ssh 11044 1726853255.47711: Set connection var ansible_pipelining to False 11044 1726853255.47727: variable 'ansible_shell_executable' from source: unknown 11044 1726853255.47730: variable 'ansible_connection' from source: unknown 11044 1726853255.47733: variable 'ansible_module_compression' from source: unknown 11044 1726853255.47736: variable 'ansible_shell_type' from source: unknown 11044 1726853255.47738: variable 'ansible_shell_executable' from source: unknown 11044 1726853255.47741: variable 'ansible_host' from source: host vars for 'managed_node1' 11044 1726853255.47743: variable 'ansible_pipelining' from source: unknown 11044 1726853255.47749: variable 'ansible_timeout' from source: unknown 11044 1726853255.47751: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11044 1726853255.47899: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11044 1726853255.47910: variable 'omit' from source: magic vars 11044 1726853255.47927: starting attempt loop 11044 1726853255.47930: running the handler 11044 1726853255.47939: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11044 1726853255.47957: _low_level_execute_command(): starting 11044 1726853255.47965: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11044 1726853255.48786: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 11044 1726853255.48807: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11044 1726853255.48838: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11044 1726853255.48906: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11044 1726853255.50567: stdout chunk (state=3): >>>/root <<< 11044 1726853255.50678: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11044 1726853255.50742: stderr chunk (state=3): >>><<< 11044 1726853255.50747: stdout chunk (state=3): >>><<< 11044 1726853255.50764: 
_low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11044 1726853255.50786: _low_level_execute_command(): starting 11044 1726853255.50819: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853255.50765-12064-72598136706638 `" && echo ansible-tmp-1726853255.50765-12064-72598136706638="` echo /root/.ansible/tmp/ansible-tmp-1726853255.50765-12064-72598136706638 `" ) && sleep 0' 11044 1726853255.51400: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11044 1726853255.51406: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11044 1726853255.51457: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11044 
1726853255.51460: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11044 1726853255.51463: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 11044 1726853255.51465: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found <<< 11044 1726853255.51467: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11044 1726853255.51527: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 11044 1726853255.51558: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11044 1726853255.51561: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11044 1726853255.51602: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11044 1726853255.53489: stdout chunk (state=3): >>>ansible-tmp-1726853255.50765-12064-72598136706638=/root/.ansible/tmp/ansible-tmp-1726853255.50765-12064-72598136706638 <<< 11044 1726853255.53598: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11044 1726853255.53622: stderr chunk (state=3): >>><<< 11044 1726853255.53625: stdout chunk (state=3): >>><<< 11044 1726853255.53642: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853255.50765-12064-72598136706638=/root/.ansible/tmp/ansible-tmp-1726853255.50765-12064-72598136706638 , 
stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11044 1726853255.53674: variable 'ansible_module_compression' from source: unknown 11044 1726853255.53720: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-1104467doc9gy/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 11044 1726853255.53753: variable 'ansible_facts' from source: unknown 11044 1726853255.53810: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853255.50765-12064-72598136706638/AnsiballZ_command.py 11044 1726853255.53912: Sending initial data 11044 1726853255.53916: Sent initial data (153 bytes) 11044 1726853255.54557: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 11044 1726853255.54573: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11044 1726853255.54590: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11044 1726853255.54657: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11044 1726853255.56169: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 <<< 11044 1726853255.56177: stderr chunk (state=3): >>>debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 11044 
1726853255.56206: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 11044 1726853255.56250: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-1104467doc9gy/tmp2agbjzdn /root/.ansible/tmp/ansible-tmp-1726853255.50765-12064-72598136706638/AnsiballZ_command.py <<< 11044 1726853255.56254: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853255.50765-12064-72598136706638/AnsiballZ_command.py" <<< 11044 1726853255.56289: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-1104467doc9gy/tmp2agbjzdn" to remote "/root/.ansible/tmp/ansible-tmp-1726853255.50765-12064-72598136706638/AnsiballZ_command.py" <<< 11044 1726853255.56296: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853255.50765-12064-72598136706638/AnsiballZ_command.py" <<< 11044 1726853255.56827: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11044 1726853255.56866: stderr chunk (state=3): >>><<< 11044 1726853255.56869: stdout chunk (state=3): >>><<< 11044 1726853255.56891: done transferring module to remote 11044 1726853255.56902: _low_level_execute_command(): starting 11044 1726853255.56909: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853255.50765-12064-72598136706638/ /root/.ansible/tmp/ansible-tmp-1726853255.50765-12064-72598136706638/AnsiballZ_command.py && sleep 0' 11044 1726853255.57351: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11044 1726853255.57354: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11044 1726853255.57360: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration <<< 11044 1726853255.57363: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11044 1726853255.57402: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 11044 1726853255.57405: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11044 1726853255.57455: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11044 1726853255.59160: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11044 1726853255.59188: stderr chunk (state=3): >>><<< 11044 1726853255.59191: stdout chunk (state=3): >>><<< 11044 1726853255.59204: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11044 1726853255.59207: _low_level_execute_command(): starting 11044 1726853255.59213: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853255.50765-12064-72598136706638/AnsiballZ_command.py && sleep 0' 11044 1726853255.59643: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11044 1726853255.59646: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11044 1726853255.59649: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 11044 1726853255.59651: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found <<< 11044 1726853255.59653: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11044 
1726853255.59701: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 11044 1726853255.59706: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11044 1726853255.59708: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11044 1726853255.59751: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11044 1726853255.76835: stdout chunk (state=3): >>> {"changed": true, "stdout": "bond0.0 /etc/NetworkManager/system-connections/bond0.0.nmconnection ", "stderr": "", "rc": 0, "cmd": "nmcli -f NAME,FILENAME connection show |grep bond0.0 | grep /etc", "start": "2024-09-20 13:27:35.746935", "end": "2024-09-20 13:27:35.767509", "delta": "0:00:00.020574", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep bond0.0 | grep /etc", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 11044 1726853255.78584: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.45.153 closed. 
<<< 11044 1726853255.78588: stderr chunk (state=3): >>><<< 11044 1726853255.78590: stdout chunk (state=3): >>><<< 11044 1726853255.78594: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "bond0.0 /etc/NetworkManager/system-connections/bond0.0.nmconnection ", "stderr": "", "rc": 0, "cmd": "nmcli -f NAME,FILENAME connection show |grep bond0.0 | grep /etc", "start": "2024-09-20 13:27:35.746935", "end": "2024-09-20 13:27:35.767509", "delta": "0:00:00.020574", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep bond0.0 | grep /etc", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 
Shared connection to 10.31.45.153 closed. 11044 1726853255.78597: done with _execute_module (ansible.legacy.command, {'_raw_params': 'nmcli -f NAME,FILENAME connection show |grep bond0.0 | grep /etc', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853255.50765-12064-72598136706638/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11044 1726853255.78604: _low_level_execute_command(): starting 11044 1726853255.78607: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853255.50765-12064-72598136706638/ > /dev/null 2>&1 && sleep 0' 11044 1726853255.79095: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11044 1726853255.79100: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11044 1726853255.79275: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11044 1726853255.79279: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11044 1726853255.79281: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 <<< 11044 1726853255.79283: stderr chunk (state=3): >>>debug2: match not found <<< 11044 1726853255.79285: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11044 1726853255.79287: stderr chunk (state=3): 
>>>debug1: configuration requests final Match pass <<< 11044 1726853255.79290: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.45.153 is address <<< 11044 1726853255.79292: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 11044 1726853255.79294: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11044 1726853255.79295: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11044 1726853255.79297: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11044 1726853255.79299: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 <<< 11044 1726853255.79300: stderr chunk (state=3): >>>debug2: match found <<< 11044 1726853255.79302: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11044 1726853255.79304: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 11044 1726853255.79306: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11044 1726853255.79336: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11044 1726853255.79397: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11044 1726853255.81392: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11044 1726853255.81396: stdout chunk (state=3): >>><<< 11044 1726853255.81398: stderr chunk (state=3): >>><<< 11044 1726853255.81400: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not 
found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11044 1726853255.81402: handler run complete 11044 1726853255.81404: Evaluated conditional (False): False 11044 1726853255.81410: attempt loop complete, returning result 11044 1726853255.81412: _execute() done 11044 1726853255.81414: dumping result to json 11044 1726853255.81416: done dumping result, returning 11044 1726853255.81418: done running TaskExecutor() for managed_node1/TASK: Get NM profile info [02083763-bbaf-c5a6-f857-0000000003fe] 11044 1726853255.81420: sending task result for task 02083763-bbaf-c5a6-f857-0000000003fe 11044 1726853255.81498: done sending task result for task 02083763-bbaf-c5a6-f857-0000000003fe 11044 1726853255.81501: WORKER PROCESS EXITING ok: [managed_node1] => { "changed": false, "cmd": "nmcli -f NAME,FILENAME connection show |grep bond0.0 | grep /etc", "delta": "0:00:00.020574", "end": "2024-09-20 13:27:35.767509", "rc": 0, "start": "2024-09-20 13:27:35.746935" } STDOUT: bond0.0 /etc/NetworkManager/system-connections/bond0.0.nmconnection 11044 1726853255.81594: no more pending results, returning what we have 11044 1726853255.81598: results queue empty 11044 1726853255.81599: checking for 
any_errors_fatal 11044 1726853255.81607: done checking for any_errors_fatal 11044 1726853255.81608: checking for max_fail_percentage 11044 1726853255.81610: done checking for max_fail_percentage 11044 1726853255.81610: checking to see if all hosts have failed and the running result is not ok 11044 1726853255.81611: done checking to see if all hosts have failed 11044 1726853255.81612: getting the remaining hosts for this loop 11044 1726853255.81613: done getting the remaining hosts for this loop 11044 1726853255.81617: getting the next task for host managed_node1 11044 1726853255.81627: done getting next task for host managed_node1 11044 1726853255.81630: ^ task is: TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output 11044 1726853255.81634: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11044 1726853255.81638: getting variables 11044 1726853255.81640: in VariableManager get_vars() 11044 1726853255.81889: Calling all_inventory to load vars for managed_node1 11044 1726853255.81892: Calling groups_inventory to load vars for managed_node1 11044 1726853255.81895: Calling all_plugins_inventory to load vars for managed_node1 11044 1726853255.81905: Calling all_plugins_play to load vars for managed_node1 11044 1726853255.81909: Calling groups_plugins_inventory to load vars for managed_node1 11044 1726853255.81912: Calling groups_plugins_play to load vars for managed_node1 11044 1726853255.83465: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11044 1726853255.85145: done with get_vars() 11044 1726853255.85536: done getting variables 11044 1726853255.85710: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set NM profile exist flag and ansible_managed flag true based on the nmcli output] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:35 Friday 20 September 2024 13:27:35 -0400 (0:00:00.395) 0:00:20.233 ****** 11044 1726853255.85748: entering _queue_task() for managed_node1/set_fact 11044 1726853255.86432: worker is 1 (out of 1 available) 11044 1726853255.86448: exiting _queue_task() for managed_node1/set_fact 11044 1726853255.86461: done queuing things up, now waiting for results queue to drain 11044 1726853255.86463: waiting for pending results... 
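The "Get NM profile info" result above records the exact command that ran; a hedged reconstruction of the task it likely came from (the `register` name is grounded in the later `nm_profile_exists.rc == 0` conditional, everything else is an assumption):

```yaml
# Hypothetical reconstruction of the "Get NM profile info" task,
# based on the "cmd" string and register name visible in this log.
- name: Get NM profile info
  shell: nmcli -f NAME,FILENAME connection show | grep {{ profile }} | grep /etc
  register: nm_profile_exists
  ignore_errors: true
```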
11044 1726853255.86680: running TaskExecutor() for managed_node1/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output 11044 1726853255.86822: in run() - task 02083763-bbaf-c5a6-f857-0000000003ff 11044 1726853255.86826: variable 'ansible_search_path' from source: unknown 11044 1726853255.86829: variable 'ansible_search_path' from source: unknown 11044 1726853255.86833: calling self._execute() 11044 1726853255.86936: variable 'ansible_host' from source: host vars for 'managed_node1' 11044 1726853255.87042: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11044 1726853255.87049: variable 'omit' from source: magic vars 11044 1726853255.87539: variable 'ansible_distribution_major_version' from source: facts 11044 1726853255.87552: Evaluated conditional (ansible_distribution_major_version != '6'): True 11044 1726853255.87687: variable 'nm_profile_exists' from source: set_fact 11044 1726853255.87703: Evaluated conditional (nm_profile_exists.rc == 0): True 11044 1726853255.87710: variable 'omit' from source: magic vars 11044 1726853255.87795: variable 'omit' from source: magic vars 11044 1726853255.87807: variable 'omit' from source: magic vars 11044 1726853255.87837: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11044 1726853255.87876: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11044 1726853255.87902: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11044 1726853255.87976: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11044 1726853255.87980: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11044 1726853255.87983: variable 'inventory_hostname' from source: host vars for 'managed_node1' 
11044 1726853255.87986: variable 'ansible_host' from source: host vars for 'managed_node1' 11044 1726853255.87988: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11044 1726853255.88068: Set connection var ansible_timeout to 10 11044 1726853255.88079: Set connection var ansible_shell_executable to /bin/sh 11044 1726853255.88082: Set connection var ansible_shell_type to sh 11044 1726853255.88089: Set connection var ansible_module_compression to ZIP_DEFLATED 11044 1726853255.88095: Set connection var ansible_connection to ssh 11044 1726853255.88100: Set connection var ansible_pipelining to False 11044 1726853255.88131: variable 'ansible_shell_executable' from source: unknown 11044 1726853255.88135: variable 'ansible_connection' from source: unknown 11044 1726853255.88138: variable 'ansible_module_compression' from source: unknown 11044 1726853255.88140: variable 'ansible_shell_type' from source: unknown 11044 1726853255.88142: variable 'ansible_shell_executable' from source: unknown 11044 1726853255.88147: variable 'ansible_host' from source: host vars for 'managed_node1' 11044 1726853255.88150: variable 'ansible_pipelining' from source: unknown 11044 1726853255.88152: variable 'ansible_timeout' from source: unknown 11044 1726853255.88155: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11044 1726853255.88370: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11044 1726853255.88376: variable 'omit' from source: magic vars 11044 1726853255.88379: starting attempt loop 11044 1726853255.88381: running the handler 11044 1726853255.88383: handler run complete 11044 1726853255.88386: attempt loop complete, returning result 11044 1726853255.88388: _execute() done 
11044 1726853255.88390: dumping result to json 11044 1726853255.88392: done dumping result, returning 11044 1726853255.88394: done running TaskExecutor() for managed_node1/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output [02083763-bbaf-c5a6-f857-0000000003ff] 11044 1726853255.88398: sending task result for task 02083763-bbaf-c5a6-f857-0000000003ff 11044 1726853255.88461: done sending task result for task 02083763-bbaf-c5a6-f857-0000000003ff 11044 1726853255.88464: WORKER PROCESS EXITING ok: [managed_node1] => { "ansible_facts": { "lsr_net_profile_ansible_managed": true, "lsr_net_profile_exists": true, "lsr_net_profile_fingerprint": true }, "changed": false } 11044 1726853255.88527: no more pending results, returning what we have 11044 1726853255.88531: results queue empty 11044 1726853255.88532: checking for any_errors_fatal 11044 1726853255.88547: done checking for any_errors_fatal 11044 1726853255.88548: checking for max_fail_percentage 11044 1726853255.88550: done checking for max_fail_percentage 11044 1726853255.88550: checking to see if all hosts have failed and the running result is not ok 11044 1726853255.88551: done checking to see if all hosts have failed 11044 1726853255.88552: getting the remaining hosts for this loop 11044 1726853255.88553: done getting the remaining hosts for this loop 11044 1726853255.88557: getting the next task for host managed_node1 11044 1726853255.88565: done getting next task for host managed_node1 11044 1726853255.88567: ^ task is: TASK: Get the ansible_managed comment in ifcfg-{{ profile }} 11044 1726853255.88573: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11044 1726853255.88577: getting variables 11044 1726853255.88578: in VariableManager get_vars() 11044 1726853255.88619: Calling all_inventory to load vars for managed_node1 11044 1726853255.88623: Calling groups_inventory to load vars for managed_node1 11044 1726853255.88625: Calling all_plugins_inventory to load vars for managed_node1 11044 1726853255.88634: Calling all_plugins_play to load vars for managed_node1 11044 1726853255.88636: Calling groups_plugins_inventory to load vars for managed_node1 11044 1726853255.88639: Calling groups_plugins_play to load vars for managed_node1 11044 1726853255.90435: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11044 1726853255.91318: done with get_vars() 11044 1726853255.91335: done getting variables 11044 1726853255.91384: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 11044 1726853255.91476: variable 'profile' from source: include params 11044 1726853255.91479: variable 'item' from source: include params 11044 1726853255.91522: variable 'item' from source: include params TASK [Get the ansible_managed comment in ifcfg-bond0.0] ************************ task path: 
/tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:49 Friday 20 September 2024 13:27:35 -0400 (0:00:00.058) 0:00:20.291 ****** 11044 1726853255.91552: entering _queue_task() for managed_node1/command 11044 1726853255.91810: worker is 1 (out of 1 available) 11044 1726853255.91825: exiting _queue_task() for managed_node1/command 11044 1726853255.91841: done queuing things up, now waiting for results queue to drain 11044 1726853255.91843: waiting for pending results... 11044 1726853255.92049: running TaskExecutor() for managed_node1/TASK: Get the ansible_managed comment in ifcfg-bond0.0 11044 1726853255.92162: in run() - task 02083763-bbaf-c5a6-f857-000000000401 11044 1726853255.92206: variable 'ansible_search_path' from source: unknown 11044 1726853255.92211: variable 'ansible_search_path' from source: unknown 11044 1726853255.92241: calling self._execute() 11044 1726853255.92479: variable 'ansible_host' from source: host vars for 'managed_node1' 11044 1726853255.92482: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11044 1726853255.92485: variable 'omit' from source: magic vars 11044 1726853255.92735: variable 'ansible_distribution_major_version' from source: facts 11044 1726853255.92754: Evaluated conditional (ansible_distribution_major_version != '6'): True 11044 1726853255.92872: variable 'profile_stat' from source: set_fact 11044 1726853255.92890: Evaluated conditional (profile_stat.stat.exists): False 11044 1726853255.92893: when evaluation is False, skipping this task 11044 1726853255.92896: _execute() done 11044 1726853255.92904: dumping result to json 11044 1726853255.92907: done dumping result, returning 11044 1726853255.92910: done running TaskExecutor() for managed_node1/TASK: Get the ansible_managed comment in ifcfg-bond0.0 [02083763-bbaf-c5a6-f857-000000000401] 11044 1726853255.92927: sending task result for task 02083763-bbaf-c5a6-f857-000000000401 11044 
1726853255.93006: done sending task result for task 02083763-bbaf-c5a6-f857-000000000401 11044 1726853255.93010: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 11044 1726853255.93070: no more pending results, returning what we have 11044 1726853255.93075: results queue empty 11044 1726853255.93076: checking for any_errors_fatal 11044 1726853255.93082: done checking for any_errors_fatal 11044 1726853255.93082: checking for max_fail_percentage 11044 1726853255.93084: done checking for max_fail_percentage 11044 1726853255.93085: checking to see if all hosts have failed and the running result is not ok 11044 1726853255.93085: done checking to see if all hosts have failed 11044 1726853255.93086: getting the remaining hosts for this loop 11044 1726853255.93087: done getting the remaining hosts for this loop 11044 1726853255.93090: getting the next task for host managed_node1 11044 1726853255.93097: done getting next task for host managed_node1 11044 1726853255.93099: ^ task is: TASK: Verify the ansible_managed comment in ifcfg-{{ profile }} 11044 1726853255.93103: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False 11044 1726853255.93107: getting variables 11044 1726853255.93109: in VariableManager get_vars() 11044 1726853255.93153: Calling all_inventory to load vars for managed_node1 11044 1726853255.93156: Calling groups_inventory to load vars for managed_node1 11044 1726853255.93158: Calling all_plugins_inventory to load vars for managed_node1 11044 1726853255.93169: Calling all_plugins_play to load vars for managed_node1 11044 1726853255.93173: Calling groups_plugins_inventory to load vars for managed_node1 11044 1726853255.93175: Calling groups_plugins_play to load vars for managed_node1 11044 1726853255.94534: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11044 1726853255.95381: done with get_vars() 11044 1726853255.95397: done getting variables 11044 1726853255.95442: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 11044 1726853255.95531: variable 'profile' from source: include params 11044 1726853255.95534: variable 'item' from source: include params 11044 1726853255.95578: variable 'item' from source: include params TASK [Verify the ansible_managed comment in ifcfg-bond0.0] ********************* task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:56 Friday 20 September 2024 13:27:35 -0400 (0:00:00.040) 0:00:20.331 ****** 11044 1726853255.95601: entering _queue_task() for managed_node1/set_fact 11044 1726853255.95862: worker is 1 (out of 1 available) 11044 1726853255.95877: exiting _queue_task() for managed_node1/set_fact 11044 1726853255.95890: done queuing things up, now waiting for results queue 
to drain 11044 1726853255.95891: waiting for pending results... 11044 1726853255.96062: running TaskExecutor() for managed_node1/TASK: Verify the ansible_managed comment in ifcfg-bond0.0 11044 1726853255.96148: in run() - task 02083763-bbaf-c5a6-f857-000000000402 11044 1726853255.96159: variable 'ansible_search_path' from source: unknown 11044 1726853255.96164: variable 'ansible_search_path' from source: unknown 11044 1726853255.96193: calling self._execute() 11044 1726853255.96313: variable 'ansible_host' from source: host vars for 'managed_node1' 11044 1726853255.96317: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11044 1726853255.96320: variable 'omit' from source: magic vars 11044 1726853255.96696: variable 'ansible_distribution_major_version' from source: facts 11044 1726853255.96700: Evaluated conditional (ansible_distribution_major_version != '6'): True 11044 1726853255.96803: variable 'profile_stat' from source: set_fact 11044 1726853255.96813: Evaluated conditional (profile_stat.stat.exists): False 11044 1726853255.96816: when evaluation is False, skipping this task 11044 1726853255.96819: _execute() done 11044 1726853255.96822: dumping result to json 11044 1726853255.96825: done dumping result, returning 11044 1726853255.96832: done running TaskExecutor() for managed_node1/TASK: Verify the ansible_managed comment in ifcfg-bond0.0 [02083763-bbaf-c5a6-f857-000000000402] 11044 1726853255.96835: sending task result for task 02083763-bbaf-c5a6-f857-000000000402 skipping: [managed_node1] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 11044 1726853255.97017: no more pending results, returning what we have 11044 1726853255.97025: results queue empty 11044 1726853255.97026: checking for any_errors_fatal 11044 1726853255.97036: done checking for any_errors_fatal 11044 1726853255.97037: checking for max_fail_percentage 11044 1726853255.97038: done checking for 
max_fail_percentage 11044 1726853255.97039: checking to see if all hosts have failed and the running result is not ok 11044 1726853255.97040: done checking to see if all hosts have failed 11044 1726853255.97041: getting the remaining hosts for this loop 11044 1726853255.97042: done getting the remaining hosts for this loop 11044 1726853255.97047: getting the next task for host managed_node1 11044 1726853255.97054: done getting next task for host managed_node1 11044 1726853255.97056: ^ task is: TASK: Get the fingerprint comment in ifcfg-{{ profile }} 11044 1726853255.97059: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11044 1726853255.97063: getting variables 11044 1726853255.97064: in VariableManager get_vars() 11044 1726853255.97101: Calling all_inventory to load vars for managed_node1 11044 1726853255.97104: Calling groups_inventory to load vars for managed_node1 11044 1726853255.97106: Calling all_plugins_inventory to load vars for managed_node1 11044 1726853255.97115: Calling all_plugins_play to load vars for managed_node1 11044 1726853255.97118: Calling groups_plugins_inventory to load vars for managed_node1 11044 1726853255.97120: Calling groups_plugins_play to load vars for managed_node1 11044 1726853255.97645: done sending task result for task 02083763-bbaf-c5a6-f857-000000000402 11044 1726853255.97649: WORKER PROCESS EXITING 11044 1726853255.98136: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11044 1726853255.99002: done with get_vars() 11044 1726853255.99017: done getting variables 11044 1726853255.99061: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 11044 1726853255.99144: variable 'profile' from source: include params 11044 1726853255.99148: variable 'item' from source: include params 11044 1726853255.99195: variable 'item' from source: include params TASK [Get the fingerprint comment in ifcfg-bond0.0] **************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:62 Friday 20 September 2024 13:27:35 -0400 (0:00:00.036) 0:00:20.367 ****** 11044 1726853255.99218: entering _queue_task() for managed_node1/command 11044 1726853255.99473: worker is 1 (out of 1 available) 11044 1726853255.99486: exiting _queue_task() for managed_node1/command 11044 
1726853255.99500: done queuing things up, now waiting for results queue to drain 11044 1726853255.99501: waiting for pending results... 11044 1726853255.99691: running TaskExecutor() for managed_node1/TASK: Get the fingerprint comment in ifcfg-bond0.0 11044 1726853255.99768: in run() - task 02083763-bbaf-c5a6-f857-000000000403 11044 1726853255.99781: variable 'ansible_search_path' from source: unknown 11044 1726853255.99785: variable 'ansible_search_path' from source: unknown 11044 1726853255.99817: calling self._execute() 11044 1726853255.99900: variable 'ansible_host' from source: host vars for 'managed_node1' 11044 1726853255.99905: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11044 1726853255.99913: variable 'omit' from source: magic vars 11044 1726853256.00204: variable 'ansible_distribution_major_version' from source: facts 11044 1726853256.00214: Evaluated conditional (ansible_distribution_major_version != '6'): True 11044 1726853256.00303: variable 'profile_stat' from source: set_fact 11044 1726853256.00314: Evaluated conditional (profile_stat.stat.exists): False 11044 1726853256.00318: when evaluation is False, skipping this task 11044 1726853256.00321: _execute() done 11044 1726853256.00323: dumping result to json 11044 1726853256.00326: done dumping result, returning 11044 1726853256.00332: done running TaskExecutor() for managed_node1/TASK: Get the fingerprint comment in ifcfg-bond0.0 [02083763-bbaf-c5a6-f857-000000000403] 11044 1726853256.00337: sending task result for task 02083763-bbaf-c5a6-f857-000000000403 11044 1726853256.00419: done sending task result for task 02083763-bbaf-c5a6-f857-000000000403 11044 1726853256.00422: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 11044 1726853256.00468: no more pending results, returning what we have 11044 1726853256.00472: results queue empty 11044 
1726853256.00474: checking for any_errors_fatal 11044 1726853256.00481: done checking for any_errors_fatal 11044 1726853256.00481: checking for max_fail_percentage 11044 1726853256.00483: done checking for max_fail_percentage 11044 1726853256.00483: checking to see if all hosts have failed and the running result is not ok 11044 1726853256.00484: done checking to see if all hosts have failed 11044 1726853256.00485: getting the remaining hosts for this loop 11044 1726853256.00486: done getting the remaining hosts for this loop 11044 1726853256.00489: getting the next task for host managed_node1 11044 1726853256.00496: done getting next task for host managed_node1 11044 1726853256.00498: ^ task is: TASK: Verify the fingerprint comment in ifcfg-{{ profile }} 11044 1726853256.00502: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11044 1726853256.00506: getting variables 11044 1726853256.00507: in VariableManager get_vars() 11044 1726853256.00547: Calling all_inventory to load vars for managed_node1 11044 1726853256.00550: Calling groups_inventory to load vars for managed_node1 11044 1726853256.00552: Calling all_plugins_inventory to load vars for managed_node1 11044 1726853256.00564: Calling all_plugins_play to load vars for managed_node1 11044 1726853256.00566: Calling groups_plugins_inventory to load vars for managed_node1 11044 1726853256.00569: Calling groups_plugins_play to load vars for managed_node1 11044 1726853256.01430: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11044 1726853256.02283: done with get_vars() 11044 1726853256.02298: done getting variables 11044 1726853256.02341: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 11044 1726853256.02421: variable 'profile' from source: include params 11044 1726853256.02424: variable 'item' from source: include params 11044 1726853256.02464: variable 'item' from source: include params TASK [Verify the fingerprint comment in ifcfg-bond0.0] ************************* task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:69 Friday 20 September 2024 13:27:36 -0400 (0:00:00.032) 0:00:20.400 ****** 11044 1726853256.02487: entering _queue_task() for managed_node1/set_fact 11044 1726853256.02744: worker is 1 (out of 1 available) 11044 1726853256.02757: exiting _queue_task() for managed_node1/set_fact 11044 1726853256.02769: done queuing things up, now waiting for results queue to drain 11044 1726853256.02772: waiting for pending results... 
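Each of the skipped ifcfg tasks above reports `"false_condition": "profile_stat.stat.exists"`, i.e. they are gated on a previously registered `stat` of the ifcfg file and skipped here because the profile lives in a NetworkManager keyfile (`/etc/NetworkManager/system-connections/bond0.0.nmconnection`) rather than an ifcfg file. A sketch of that gating pattern (task body is an assumption; only the `when` condition is taken from the log):

```yaml
# Hypothetical shape of the skipped ifcfg verification tasks: each one
# only runs when the earlier stat (registered as profile_stat) found
# an ifcfg-{{ profile }} file.
- name: Verify the fingerprint comment in ifcfg-{{ profile }}
  set_fact:
    lsr_net_profile_fingerprint: true
  when: profile_stat.stat.exists
```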
11044 1726853256.02952: running TaskExecutor() for managed_node1/TASK: Verify the fingerprint comment in ifcfg-bond0.0 11044 1726853256.03032: in run() - task 02083763-bbaf-c5a6-f857-000000000404 11044 1726853256.03045: variable 'ansible_search_path' from source: unknown 11044 1726853256.03048: variable 'ansible_search_path' from source: unknown 11044 1726853256.03080: calling self._execute() 11044 1726853256.03161: variable 'ansible_host' from source: host vars for 'managed_node1' 11044 1726853256.03164: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11044 1726853256.03174: variable 'omit' from source: magic vars 11044 1726853256.03449: variable 'ansible_distribution_major_version' from source: facts 11044 1726853256.03460: Evaluated conditional (ansible_distribution_major_version != '6'): True 11044 1726853256.03546: variable 'profile_stat' from source: set_fact 11044 1726853256.03556: Evaluated conditional (profile_stat.stat.exists): False 11044 1726853256.03559: when evaluation is False, skipping this task 11044 1726853256.03563: _execute() done 11044 1726853256.03565: dumping result to json 11044 1726853256.03567: done dumping result, returning 11044 1726853256.03575: done running TaskExecutor() for managed_node1/TASK: Verify the fingerprint comment in ifcfg-bond0.0 [02083763-bbaf-c5a6-f857-000000000404] 11044 1726853256.03580: sending task result for task 02083763-bbaf-c5a6-f857-000000000404 11044 1726853256.03663: done sending task result for task 02083763-bbaf-c5a6-f857-000000000404 11044 1726853256.03666: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 11044 1726853256.03709: no more pending results, returning what we have 11044 1726853256.03712: results queue empty 11044 1726853256.03713: checking for any_errors_fatal 11044 1726853256.03720: done checking for any_errors_fatal 11044 1726853256.03721: checking 
for max_fail_percentage 11044 1726853256.03723: done checking for max_fail_percentage 11044 1726853256.03723: checking to see if all hosts have failed and the running result is not ok 11044 1726853256.03724: done checking to see if all hosts have failed 11044 1726853256.03725: getting the remaining hosts for this loop 11044 1726853256.03726: done getting the remaining hosts for this loop 11044 1726853256.03729: getting the next task for host managed_node1 11044 1726853256.03736: done getting next task for host managed_node1 11044 1726853256.03738: ^ task is: TASK: Assert that the profile is present - '{{ profile }}' 11044 1726853256.03742: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11044 1726853256.03747: getting variables 11044 1726853256.03748: in VariableManager get_vars() 11044 1726853256.03790: Calling all_inventory to load vars for managed_node1 11044 1726853256.03793: Calling groups_inventory to load vars for managed_node1 11044 1726853256.03795: Calling all_plugins_inventory to load vars for managed_node1 11044 1726853256.03806: Calling all_plugins_play to load vars for managed_node1 11044 1726853256.03808: Calling groups_plugins_inventory to load vars for managed_node1 11044 1726853256.03811: Calling groups_plugins_play to load vars for managed_node1 11044 1726853256.04577: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11044 1726853256.05525: done with get_vars() 11044 1726853256.05539: done getting variables 11044 1726853256.05582: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 11044 1726853256.05663: variable 'profile' from source: include params 11044 1726853256.05666: variable 'item' from source: include params 11044 1726853256.05705: variable 'item' from source: include params TASK [Assert that the profile is present - 'bond0.0'] ************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:5 Friday 20 September 2024 13:27:36 -0400 (0:00:00.032) 0:00:20.433 ****** 11044 1726853256.05729: entering _queue_task() for managed_node1/assert 11044 1726853256.05970: worker is 1 (out of 1 available) 11044 1726853256.05987: exiting _queue_task() for managed_node1/assert 11044 1726853256.06000: done queuing things up, now waiting for results queue to drain 11044 1726853256.06002: waiting for pending results... 
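The assert task queued above evaluates the conditional `(lsr_net_profile_exists): True` and reports "All assertions passed"; a minimal reconstruction of what `assert_profile_present.yml:5` likely contains, assuming the logged conditional is the assertion:

```yaml
# Hypothetical reconstruction of the profile-presence assertion,
# based on the conditional evaluated in this log.
- name: Assert that the profile is present - '{{ profile }}'
  assert:
    that:
      - lsr_net_profile_exists
```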
11044 1726853256.06183: running TaskExecutor() for managed_node1/TASK: Assert that the profile is present - 'bond0.0' 11044 1726853256.06248: in run() - task 02083763-bbaf-c5a6-f857-000000000268 11044 1726853256.06262: variable 'ansible_search_path' from source: unknown 11044 1726853256.06266: variable 'ansible_search_path' from source: unknown 11044 1726853256.06295: calling self._execute() 11044 1726853256.06379: variable 'ansible_host' from source: host vars for 'managed_node1' 11044 1726853256.06384: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11044 1726853256.06392: variable 'omit' from source: magic vars 11044 1726853256.06667: variable 'ansible_distribution_major_version' from source: facts 11044 1726853256.06672: Evaluated conditional (ansible_distribution_major_version != '6'): True 11044 1726853256.06680: variable 'omit' from source: magic vars 11044 1726853256.06706: variable 'omit' from source: magic vars 11044 1726853256.06775: variable 'profile' from source: include params 11044 1726853256.06779: variable 'item' from source: include params 11044 1726853256.06824: variable 'item' from source: include params 11044 1726853256.06839: variable 'omit' from source: magic vars 11044 1726853256.06876: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11044 1726853256.06904: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11044 1726853256.06921: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11044 1726853256.06934: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11044 1726853256.06945: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11044 1726853256.06970: variable 'inventory_hostname' from source: host vars for 
'managed_node1' 11044 1726853256.06974: variable 'ansible_host' from source: host vars for 'managed_node1' 11044 1726853256.06977: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11044 1726853256.07046: Set connection var ansible_timeout to 10 11044 1726853256.07055: Set connection var ansible_shell_executable to /bin/sh 11044 1726853256.07058: Set connection var ansible_shell_type to sh 11044 1726853256.07063: Set connection var ansible_module_compression to ZIP_DEFLATED 11044 1726853256.07068: Set connection var ansible_connection to ssh 11044 1726853256.07074: Set connection var ansible_pipelining to False 11044 1726853256.07091: variable 'ansible_shell_executable' from source: unknown 11044 1726853256.07096: variable 'ansible_connection' from source: unknown 11044 1726853256.07099: variable 'ansible_module_compression' from source: unknown 11044 1726853256.07101: variable 'ansible_shell_type' from source: unknown 11044 1726853256.07104: variable 'ansible_shell_executable' from source: unknown 11044 1726853256.07108: variable 'ansible_host' from source: host vars for 'managed_node1' 11044 1726853256.07110: variable 'ansible_pipelining' from source: unknown 11044 1726853256.07113: variable 'ansible_timeout' from source: unknown 11044 1726853256.07115: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11044 1726853256.07212: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11044 1726853256.07230: variable 'omit' from source: magic vars 11044 1726853256.07236: starting attempt loop 11044 1726853256.07239: running the handler 11044 1726853256.07307: variable 'lsr_net_profile_exists' from source: set_fact 11044 1726853256.07311: Evaluated conditional 
(lsr_net_profile_exists): True 11044 1726853256.07317: handler run complete 11044 1726853256.07327: attempt loop complete, returning result 11044 1726853256.07330: _execute() done 11044 1726853256.07342: dumping result to json 11044 1726853256.07348: done dumping result, returning 11044 1726853256.07350: done running TaskExecutor() for managed_node1/TASK: Assert that the profile is present - 'bond0.0' [02083763-bbaf-c5a6-f857-000000000268] 11044 1726853256.07352: sending task result for task 02083763-bbaf-c5a6-f857-000000000268 11044 1726853256.07427: done sending task result for task 02083763-bbaf-c5a6-f857-000000000268 11044 1726853256.07430: WORKER PROCESS EXITING ok: [managed_node1] => { "changed": false } MSG: All assertions passed 11044 1726853256.07495: no more pending results, returning what we have 11044 1726853256.07498: results queue empty 11044 1726853256.07498: checking for any_errors_fatal 11044 1726853256.07505: done checking for any_errors_fatal 11044 1726853256.07506: checking for max_fail_percentage 11044 1726853256.07507: done checking for max_fail_percentage 11044 1726853256.07508: checking to see if all hosts have failed and the running result is not ok 11044 1726853256.07509: done checking to see if all hosts have failed 11044 1726853256.07509: getting the remaining hosts for this loop 11044 1726853256.07510: done getting the remaining hosts for this loop 11044 1726853256.07513: getting the next task for host managed_node1 11044 1726853256.07518: done getting next task for host managed_node1 11044 1726853256.07520: ^ task is: TASK: Assert that the ansible managed comment is present in '{{ profile }}' 11044 1726853256.07523: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11044 1726853256.07526: getting variables 11044 1726853256.07528: in VariableManager get_vars() 11044 1726853256.07566: Calling all_inventory to load vars for managed_node1 11044 1726853256.07569: Calling groups_inventory to load vars for managed_node1 11044 1726853256.07573: Calling all_plugins_inventory to load vars for managed_node1 11044 1726853256.07581: Calling all_plugins_play to load vars for managed_node1 11044 1726853256.07584: Calling groups_plugins_inventory to load vars for managed_node1 11044 1726853256.07586: Calling groups_plugins_play to load vars for managed_node1 11044 1726853256.08340: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11044 1726853256.09192: done with get_vars() 11044 1726853256.09207: done getting variables 11044 1726853256.09248: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 11044 1726853256.09329: variable 'profile' from source: include params 11044 1726853256.09332: variable 'item' from source: include params 11044 1726853256.09370: variable 'item' from source: include params TASK [Assert that the ansible managed comment is present in 'bond0.0'] ********* task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:10 Friday 20 September 2024 13:27:36 -0400 
(0:00:00.036) 0:00:20.469 ****** 11044 1726853256.09399: entering _queue_task() for managed_node1/assert 11044 1726853256.09633: worker is 1 (out of 1 available) 11044 1726853256.09646: exiting _queue_task() for managed_node1/assert 11044 1726853256.09659: done queuing things up, now waiting for results queue to drain 11044 1726853256.09661: waiting for pending results... 11044 1726853256.09838: running TaskExecutor() for managed_node1/TASK: Assert that the ansible managed comment is present in 'bond0.0' 11044 1726853256.09910: in run() - task 02083763-bbaf-c5a6-f857-000000000269 11044 1726853256.09921: variable 'ansible_search_path' from source: unknown 11044 1726853256.09925: variable 'ansible_search_path' from source: unknown 11044 1726853256.09956: calling self._execute() 11044 1726853256.10036: variable 'ansible_host' from source: host vars for 'managed_node1' 11044 1726853256.10041: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11044 1726853256.10053: variable 'omit' from source: magic vars 11044 1726853256.10325: variable 'ansible_distribution_major_version' from source: facts 11044 1726853256.10329: Evaluated conditional (ansible_distribution_major_version != '6'): True 11044 1726853256.10336: variable 'omit' from source: magic vars 11044 1726853256.10366: variable 'omit' from source: magic vars 11044 1726853256.10437: variable 'profile' from source: include params 11044 1726853256.10442: variable 'item' from source: include params 11044 1726853256.10490: variable 'item' from source: include params 11044 1726853256.10505: variable 'omit' from source: magic vars 11044 1726853256.10542: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11044 1726853256.10568: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11044 1726853256.10586: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11044 
1726853256.10600: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11044 1726853256.10609: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11044 1726853256.10634: variable 'inventory_hostname' from source: host vars for 'managed_node1' 11044 1726853256.10637: variable 'ansible_host' from source: host vars for 'managed_node1' 11044 1726853256.10640: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11044 1726853256.10712: Set connection var ansible_timeout to 10 11044 1726853256.10718: Set connection var ansible_shell_executable to /bin/sh 11044 1726853256.10721: Set connection var ansible_shell_type to sh 11044 1726853256.10726: Set connection var ansible_module_compression to ZIP_DEFLATED 11044 1726853256.10731: Set connection var ansible_connection to ssh 11044 1726853256.10736: Set connection var ansible_pipelining to False 11044 1726853256.10757: variable 'ansible_shell_executable' from source: unknown 11044 1726853256.10764: variable 'ansible_connection' from source: unknown 11044 1726853256.10767: variable 'ansible_module_compression' from source: unknown 11044 1726853256.10769: variable 'ansible_shell_type' from source: unknown 11044 1726853256.10773: variable 'ansible_shell_executable' from source: unknown 11044 1726853256.10781: variable 'ansible_host' from source: host vars for 'managed_node1' 11044 1726853256.10784: variable 'ansible_pipelining' from source: unknown 11044 1726853256.10787: variable 'ansible_timeout' from source: unknown 11044 1726853256.10789: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11044 1726853256.10891: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11044 1726853256.10900: variable 'omit' from source: magic vars 11044 1726853256.10905: starting attempt loop 11044 1726853256.10908: running the handler 11044 1726853256.10982: variable 'lsr_net_profile_ansible_managed' from source: set_fact 11044 1726853256.10986: Evaluated conditional (lsr_net_profile_ansible_managed): True 11044 1726853256.10994: handler run complete 11044 1726853256.11005: attempt loop complete, returning result 11044 1726853256.11008: _execute() done 11044 1726853256.11010: dumping result to json 11044 1726853256.11013: done dumping result, returning 11044 1726853256.11018: done running TaskExecutor() for managed_node1/TASK: Assert that the ansible managed comment is present in 'bond0.0' [02083763-bbaf-c5a6-f857-000000000269] 11044 1726853256.11023: sending task result for task 02083763-bbaf-c5a6-f857-000000000269 11044 1726853256.11102: done sending task result for task 02083763-bbaf-c5a6-f857-000000000269 11044 1726853256.11104: WORKER PROCESS EXITING ok: [managed_node1] => { "changed": false } MSG: All assertions passed 11044 1726853256.11150: no more pending results, returning what we have 11044 1726853256.11153: results queue empty 11044 1726853256.11154: checking for any_errors_fatal 11044 1726853256.11160: done checking for any_errors_fatal 11044 1726853256.11160: checking for max_fail_percentage 11044 1726853256.11162: done checking for max_fail_percentage 11044 1726853256.11162: checking to see if all hosts have failed and the running result is not ok 11044 1726853256.11163: done checking to see if all hosts have failed 11044 1726853256.11164: getting the remaining hosts for this loop 11044 1726853256.11165: done getting the remaining hosts for this loop 11044 1726853256.11168: getting the next task for host managed_node1 11044 1726853256.11175: done getting 
next task for host managed_node1 11044 1726853256.11177: ^ task is: TASK: Assert that the fingerprint comment is present in {{ profile }} 11044 1726853256.11181: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11044 1726853256.11185: getting variables 11044 1726853256.11186: in VariableManager get_vars() 11044 1726853256.11227: Calling all_inventory to load vars for managed_node1 11044 1726853256.11230: Calling groups_inventory to load vars for managed_node1 11044 1726853256.11232: Calling all_plugins_inventory to load vars for managed_node1 11044 1726853256.11242: Calling all_plugins_play to load vars for managed_node1 11044 1726853256.11245: Calling groups_plugins_inventory to load vars for managed_node1 11044 1726853256.11247: Calling groups_plugins_play to load vars for managed_node1 11044 1726853256.12149: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11044 1726853256.12985: done with get_vars() 11044 1726853256.13000: done getting variables 11044 1726853256.13042: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 11044 1726853256.13120: variable 'profile' from source: include params 11044 1726853256.13123: variable 'item' from 
source: include params 11044 1726853256.13162: variable 'item' from source: include params TASK [Assert that the fingerprint comment is present in bond0.0] *************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:15 Friday 20 September 2024 13:27:36 -0400 (0:00:00.037) 0:00:20.507 ****** 11044 1726853256.13189: entering _queue_task() for managed_node1/assert 11044 1726853256.13440: worker is 1 (out of 1 available) 11044 1726853256.13453: exiting _queue_task() for managed_node1/assert 11044 1726853256.13466: done queuing things up, now waiting for results queue to drain 11044 1726853256.13467: waiting for pending results... 11044 1726853256.13648: running TaskExecutor() for managed_node1/TASK: Assert that the fingerprint comment is present in bond0.0 11044 1726853256.13712: in run() - task 02083763-bbaf-c5a6-f857-00000000026a 11044 1726853256.13723: variable 'ansible_search_path' from source: unknown 11044 1726853256.13727: variable 'ansible_search_path' from source: unknown 11044 1726853256.13756: calling self._execute() 11044 1726853256.13835: variable 'ansible_host' from source: host vars for 'managed_node1' 11044 1726853256.13840: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11044 1726853256.13847: variable 'omit' from source: magic vars 11044 1726853256.14118: variable 'ansible_distribution_major_version' from source: facts 11044 1726853256.14128: Evaluated conditional (ansible_distribution_major_version != '6'): True 11044 1726853256.14132: variable 'omit' from source: magic vars 11044 1726853256.14161: variable 'omit' from source: magic vars 11044 1726853256.14230: variable 'profile' from source: include params 11044 1726853256.14234: variable 'item' from source: include params 11044 1726853256.14282: variable 'item' from source: include params 11044 1726853256.14295: variable 'omit' from source: magic vars 11044 1726853256.14329: 
trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11044 1726853256.14364: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11044 1726853256.14377: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11044 1726853256.14391: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11044 1726853256.14401: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11044 1726853256.14425: variable 'inventory_hostname' from source: host vars for 'managed_node1' 11044 1726853256.14428: variable 'ansible_host' from source: host vars for 'managed_node1' 11044 1726853256.14431: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11044 1726853256.14504: Set connection var ansible_timeout to 10 11044 1726853256.14511: Set connection var ansible_shell_executable to /bin/sh 11044 1726853256.14513: Set connection var ansible_shell_type to sh 11044 1726853256.14518: Set connection var ansible_module_compression to ZIP_DEFLATED 11044 1726853256.14523: Set connection var ansible_connection to ssh 11044 1726853256.14528: Set connection var ansible_pipelining to False 11044 1726853256.14549: variable 'ansible_shell_executable' from source: unknown 11044 1726853256.14552: variable 'ansible_connection' from source: unknown 11044 1726853256.14554: variable 'ansible_module_compression' from source: unknown 11044 1726853256.14557: variable 'ansible_shell_type' from source: unknown 11044 1726853256.14559: variable 'ansible_shell_executable' from source: unknown 11044 1726853256.14561: variable 'ansible_host' from source: host vars for 'managed_node1' 11044 1726853256.14564: variable 'ansible_pipelining' from source: unknown 11044 1726853256.14566: variable 'ansible_timeout' 
from source: unknown 11044 1726853256.14569: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11044 1726853256.14800: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11044 1726853256.14803: variable 'omit' from source: magic vars 11044 1726853256.14806: starting attempt loop 11044 1726853256.14808: running the handler 11044 1726853256.14821: variable 'lsr_net_profile_fingerprint' from source: set_fact 11044 1726853256.14827: Evaluated conditional (lsr_net_profile_fingerprint): True 11044 1726853256.14833: handler run complete 11044 1726853256.14849: attempt loop complete, returning result 11044 1726853256.14852: _execute() done 11044 1726853256.14854: dumping result to json 11044 1726853256.14857: done dumping result, returning 11044 1726853256.14862: done running TaskExecutor() for managed_node1/TASK: Assert that the fingerprint comment is present in bond0.0 [02083763-bbaf-c5a6-f857-00000000026a] 11044 1726853256.14866: sending task result for task 02083763-bbaf-c5a6-f857-00000000026a 11044 1726853256.14957: done sending task result for task 02083763-bbaf-c5a6-f857-00000000026a 11044 1726853256.14960: WORKER PROCESS EXITING ok: [managed_node1] => { "changed": false } MSG: All assertions passed 11044 1726853256.15037: no more pending results, returning what we have 11044 1726853256.15040: results queue empty 11044 1726853256.15041: checking for any_errors_fatal 11044 1726853256.15049: done checking for any_errors_fatal 11044 1726853256.15050: checking for max_fail_percentage 11044 1726853256.15052: done checking for max_fail_percentage 11044 1726853256.15052: checking to see if all hosts have failed and the running result is not ok 11044 1726853256.15053: done checking to see if all 
hosts have failed 11044 1726853256.15054: getting the remaining hosts for this loop 11044 1726853256.15055: done getting the remaining hosts for this loop 11044 1726853256.15058: getting the next task for host managed_node1 11044 1726853256.15066: done getting next task for host managed_node1 11044 1726853256.15069: ^ task is: TASK: Include the task 'get_profile_stat.yml' 11044 1726853256.15073: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11044 1726853256.15077: getting variables 11044 1726853256.15078: in VariableManager get_vars() 11044 1726853256.15116: Calling all_inventory to load vars for managed_node1 11044 1726853256.15119: Calling groups_inventory to load vars for managed_node1 11044 1726853256.15121: Calling all_plugins_inventory to load vars for managed_node1 11044 1726853256.15130: Calling all_plugins_play to load vars for managed_node1 11044 1726853256.15132: Calling groups_plugins_inventory to load vars for managed_node1 11044 1726853256.15134: Calling groups_plugins_play to load vars for managed_node1 11044 1726853256.16308: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11044 1726853256.17162: done with get_vars() 11044 1726853256.17182: done getting variables TASK [Include the task 'get_profile_stat.yml'] ********************************* task path: 
/tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:3 Friday 20 September 2024 13:27:36 -0400 (0:00:00.040) 0:00:20.548 ****** 11044 1726853256.17252: entering _queue_task() for managed_node1/include_tasks 11044 1726853256.17509: worker is 1 (out of 1 available) 11044 1726853256.17522: exiting _queue_task() for managed_node1/include_tasks 11044 1726853256.17536: done queuing things up, now waiting for results queue to drain 11044 1726853256.17537: waiting for pending results... 11044 1726853256.17726: running TaskExecutor() for managed_node1/TASK: Include the task 'get_profile_stat.yml' 11044 1726853256.17801: in run() - task 02083763-bbaf-c5a6-f857-00000000026e 11044 1726853256.17813: variable 'ansible_search_path' from source: unknown 11044 1726853256.17816: variable 'ansible_search_path' from source: unknown 11044 1726853256.17848: calling self._execute() 11044 1726853256.17926: variable 'ansible_host' from source: host vars for 'managed_node1' 11044 1726853256.17930: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11044 1726853256.17939: variable 'omit' from source: magic vars 11044 1726853256.18643: variable 'ansible_distribution_major_version' from source: facts 11044 1726853256.18648: Evaluated conditional (ansible_distribution_major_version != '6'): True 11044 1726853256.18650: _execute() done 11044 1726853256.18653: dumping result to json 11044 1726853256.18655: done dumping result, returning 11044 1726853256.18656: done running TaskExecutor() for managed_node1/TASK: Include the task 'get_profile_stat.yml' [02083763-bbaf-c5a6-f857-00000000026e] 11044 1726853256.18658: sending task result for task 02083763-bbaf-c5a6-f857-00000000026e 11044 1726853256.18718: done sending task result for task 02083763-bbaf-c5a6-f857-00000000026e 11044 1726853256.18721: WORKER PROCESS EXITING 11044 1726853256.18768: no more pending results, returning what we have 11044 
1726853256.18773: in VariableManager get_vars() 11044 1726853256.18810: Calling all_inventory to load vars for managed_node1 11044 1726853256.18813: Calling groups_inventory to load vars for managed_node1 11044 1726853256.18814: Calling all_plugins_inventory to load vars for managed_node1 11044 1726853256.18823: Calling all_plugins_play to load vars for managed_node1 11044 1726853256.18826: Calling groups_plugins_inventory to load vars for managed_node1 11044 1726853256.18828: Calling groups_plugins_play to load vars for managed_node1 11044 1726853256.20314: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11044 1726853256.21916: done with get_vars() 11044 1726853256.21941: variable 'ansible_search_path' from source: unknown 11044 1726853256.21942: variable 'ansible_search_path' from source: unknown 11044 1726853256.21991: we have included files to process 11044 1726853256.21992: generating all_blocks data 11044 1726853256.21994: done generating all_blocks data 11044 1726853256.21999: processing included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 11044 1726853256.22001: loading included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 11044 1726853256.22003: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 11044 1726853256.22917: done processing included file 11044 1726853256.22920: iterating over new_blocks loaded from include file 11044 1726853256.22921: in VariableManager get_vars() 11044 1726853256.22943: done with get_vars() 11044 1726853256.22948: filtering new block on tags 11044 1726853256.22980: done filtering new block on tags 11044 1726853256.22983: in VariableManager get_vars() 11044 1726853256.23004: done with get_vars() 11044 1726853256.23006: filtering 
new block on tags 11044 1726853256.23027: done filtering new block on tags 11044 1726853256.23029: done iterating over new_blocks loaded from include file included: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml for managed_node1 11044 1726853256.23035: extending task lists for all hosts with included blocks 11044 1726853256.23200: done extending task lists 11044 1726853256.23201: done processing included files 11044 1726853256.23202: results queue empty 11044 1726853256.23203: checking for any_errors_fatal 11044 1726853256.23206: done checking for any_errors_fatal 11044 1726853256.23207: checking for max_fail_percentage 11044 1726853256.23208: done checking for max_fail_percentage 11044 1726853256.23209: checking to see if all hosts have failed and the running result is not ok 11044 1726853256.23210: done checking to see if all hosts have failed 11044 1726853256.23210: getting the remaining hosts for this loop 11044 1726853256.23212: done getting the remaining hosts for this loop 11044 1726853256.23214: getting the next task for host managed_node1 11044 1726853256.23218: done getting next task for host managed_node1 11044 1726853256.23221: ^ task is: TASK: Initialize NM profile exist and ansible_managed comment flag 11044 1726853256.23224: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11044 1726853256.23226: getting variables 11044 1726853256.23227: in VariableManager get_vars() 11044 1726853256.23242: Calling all_inventory to load vars for managed_node1 11044 1726853256.23248: Calling groups_inventory to load vars for managed_node1 11044 1726853256.23250: Calling all_plugins_inventory to load vars for managed_node1 11044 1726853256.23257: Calling all_plugins_play to load vars for managed_node1 11044 1726853256.23259: Calling groups_plugins_inventory to load vars for managed_node1 11044 1726853256.23262: Calling groups_plugins_play to load vars for managed_node1 11044 1726853256.24534: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11044 1726853256.26214: done with get_vars() 11044 1726853256.26238: done getting variables 11044 1726853256.26288: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Initialize NM profile exist and ansible_managed comment flag] ************ task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:3 Friday 20 September 2024 13:27:36 -0400 (0:00:00.090) 0:00:20.638 ****** 11044 1726853256.26326: entering _queue_task() for managed_node1/set_fact 11044 1726853256.26703: worker is 1 (out of 1 available) 11044 1726853256.26715: exiting _queue_task() for managed_node1/set_fact 11044 1726853256.26729: done queuing things up, now waiting for results queue to drain 11044 1726853256.26730: waiting for pending results... 
11044 1726853256.27098: running TaskExecutor() for managed_node1/TASK: Initialize NM profile exist and ansible_managed comment flag 11044 1726853256.27178: in run() - task 02083763-bbaf-c5a6-f857-000000000443 11044 1726853256.27189: variable 'ansible_search_path' from source: unknown 11044 1726853256.27193: variable 'ansible_search_path' from source: unknown 11044 1726853256.27228: calling self._execute() 11044 1726853256.27481: variable 'ansible_host' from source: host vars for 'managed_node1' 11044 1726853256.27485: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11044 1726853256.27488: variable 'omit' from source: magic vars 11044 1726853256.28026: variable 'ansible_distribution_major_version' from source: facts 11044 1726853256.28046: Evaluated conditional (ansible_distribution_major_version != '6'): True 11044 1726853256.28059: variable 'omit' from source: magic vars 11044 1726853256.28111: variable 'omit' from source: magic vars 11044 1726853256.28156: variable 'omit' from source: magic vars 11044 1726853256.28201: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11044 1726853256.28252: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11044 1726853256.28282: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11044 1726853256.28308: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11044 1726853256.28326: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11044 1726853256.28374: variable 'inventory_hostname' from source: host vars for 'managed_node1' 11044 1726853256.28385: variable 'ansible_host' from source: host vars for 'managed_node1' 11044 1726853256.28393: variable 'ansible_ssh_extra_args' from source: host vars for 
'managed_node1' 11044 1726853256.28511: Set connection var ansible_timeout to 10 11044 1726853256.28574: Set connection var ansible_shell_executable to /bin/sh 11044 1726853256.28577: Set connection var ansible_shell_type to sh 11044 1726853256.28579: Set connection var ansible_module_compression to ZIP_DEFLATED 11044 1726853256.28582: Set connection var ansible_connection to ssh 11044 1726853256.28584: Set connection var ansible_pipelining to False 11044 1726853256.28593: variable 'ansible_shell_executable' from source: unknown 11044 1726853256.28601: variable 'ansible_connection' from source: unknown 11044 1726853256.28609: variable 'ansible_module_compression' from source: unknown 11044 1726853256.28616: variable 'ansible_shell_type' from source: unknown 11044 1726853256.28624: variable 'ansible_shell_executable' from source: unknown 11044 1726853256.28631: variable 'ansible_host' from source: host vars for 'managed_node1' 11044 1726853256.28681: variable 'ansible_pipelining' from source: unknown 11044 1726853256.28684: variable 'ansible_timeout' from source: unknown 11044 1726853256.28687: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11044 1726853256.28817: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11044 1726853256.28834: variable 'omit' from source: magic vars 11044 1726853256.28847: starting attempt loop 11044 1726853256.28855: running the handler 11044 1726853256.28875: handler run complete 11044 1726853256.29076: attempt loop complete, returning result 11044 1726853256.29080: _execute() done 11044 1726853256.29082: dumping result to json 11044 1726853256.29085: done dumping result, returning 11044 1726853256.29087: done running TaskExecutor() for 
managed_node1/TASK: Initialize NM profile exist and ansible_managed comment flag [02083763-bbaf-c5a6-f857-000000000443] 11044 1726853256.29090: sending task result for task 02083763-bbaf-c5a6-f857-000000000443 11044 1726853256.29162: done sending task result for task 02083763-bbaf-c5a6-f857-000000000443 11044 1726853256.29166: WORKER PROCESS EXITING ok: [managed_node1] => { "ansible_facts": { "lsr_net_profile_ansible_managed": false, "lsr_net_profile_exists": false, "lsr_net_profile_fingerprint": false }, "changed": false } 11044 1726853256.29224: no more pending results, returning what we have 11044 1726853256.29228: results queue empty 11044 1726853256.29229: checking for any_errors_fatal 11044 1726853256.29231: done checking for any_errors_fatal 11044 1726853256.29232: checking for max_fail_percentage 11044 1726853256.29234: done checking for max_fail_percentage 11044 1726853256.29235: checking to see if all hosts have failed and the running result is not ok 11044 1726853256.29236: done checking to see if all hosts have failed 11044 1726853256.29236: getting the remaining hosts for this loop 11044 1726853256.29238: done getting the remaining hosts for this loop 11044 1726853256.29241: getting the next task for host managed_node1 11044 1726853256.29251: done getting next task for host managed_node1 11044 1726853256.29254: ^ task is: TASK: Stat profile file 11044 1726853256.29259: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11044 1726853256.29263: getting variables 11044 1726853256.29264: in VariableManager get_vars() 11044 1726853256.29311: Calling all_inventory to load vars for managed_node1 11044 1726853256.29315: Calling groups_inventory to load vars for managed_node1 11044 1726853256.29317: Calling all_plugins_inventory to load vars for managed_node1 11044 1726853256.29329: Calling all_plugins_play to load vars for managed_node1 11044 1726853256.29333: Calling groups_plugins_inventory to load vars for managed_node1 11044 1726853256.29337: Calling groups_plugins_play to load vars for managed_node1 11044 1726853256.30885: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11044 1726853256.32521: done with get_vars() 11044 1726853256.32551: done getting variables TASK [Stat profile file] ******************************************************* task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:9 Friday 20 September 2024 13:27:36 -0400 (0:00:00.063) 0:00:20.702 ****** 11044 1726853256.32659: entering _queue_task() for managed_node1/stat 11044 1726853256.33027: worker is 1 (out of 1 available) 11044 1726853256.33040: exiting _queue_task() for managed_node1/stat 11044 1726853256.33061: done queuing things up, now waiting for results queue to drain 11044 1726853256.33063: waiting for pending results... 
11044 1726853256.33400: running TaskExecutor() for managed_node1/TASK: Stat profile file 11044 1726853256.33478: in run() - task 02083763-bbaf-c5a6-f857-000000000444 11044 1726853256.33483: variable 'ansible_search_path' from source: unknown 11044 1726853256.33486: variable 'ansible_search_path' from source: unknown 11044 1726853256.33495: calling self._execute() 11044 1726853256.33595: variable 'ansible_host' from source: host vars for 'managed_node1' 11044 1726853256.33606: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11044 1726853256.33612: variable 'omit' from source: magic vars 11044 1726853256.33993: variable 'ansible_distribution_major_version' from source: facts 11044 1726853256.34042: Evaluated conditional (ansible_distribution_major_version != '6'): True 11044 1726853256.34048: variable 'omit' from source: magic vars 11044 1726853256.34063: variable 'omit' from source: magic vars 11044 1726853256.34157: variable 'profile' from source: include params 11044 1726853256.34166: variable 'item' from source: include params 11044 1726853256.34228: variable 'item' from source: include params 11044 1726853256.34263: variable 'omit' from source: magic vars 11044 1726853256.34290: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11044 1726853256.34373: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11044 1726853256.34377: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11044 1726853256.34379: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11044 1726853256.34381: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11044 1726853256.34407: variable 'inventory_hostname' from source: host vars for 'managed_node1' 11044 
1726853256.34410: variable 'ansible_host' from source: host vars for 'managed_node1' 11044 1726853256.34413: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11044 1726853256.34519: Set connection var ansible_timeout to 10 11044 1726853256.34529: Set connection var ansible_shell_executable to /bin/sh 11044 1726853256.34532: Set connection var ansible_shell_type to sh 11044 1726853256.34576: Set connection var ansible_module_compression to ZIP_DEFLATED 11044 1726853256.34579: Set connection var ansible_connection to ssh 11044 1726853256.34589: Set connection var ansible_pipelining to False 11044 1726853256.34592: variable 'ansible_shell_executable' from source: unknown 11044 1726853256.34594: variable 'ansible_connection' from source: unknown 11044 1726853256.34597: variable 'ansible_module_compression' from source: unknown 11044 1726853256.34599: variable 'ansible_shell_type' from source: unknown 11044 1726853256.34601: variable 'ansible_shell_executable' from source: unknown 11044 1726853256.34603: variable 'ansible_host' from source: host vars for 'managed_node1' 11044 1726853256.34605: variable 'ansible_pipelining' from source: unknown 11044 1726853256.34607: variable 'ansible_timeout' from source: unknown 11044 1726853256.34609: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11044 1726853256.34792: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 11044 1726853256.34804: variable 'omit' from source: magic vars 11044 1726853256.34807: starting attempt loop 11044 1726853256.34810: running the handler 11044 1726853256.34824: _low_level_execute_command(): starting 11044 1726853256.34832: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11044 1726853256.35564: stderr chunk 
(state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 11044 1726853256.35587: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11044 1726853256.35677: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK <<< 11044 1726853256.35681: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11044 1726853256.35749: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11044 1726853256.37448: stdout chunk (state=3): >>>/root <<< 11044 1726853256.37538: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11044 1726853256.37659: stderr chunk (state=3): >>><<< 11044 1726853256.37663: stdout chunk (state=3): >>><<< 11044 1726853256.37796: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 
originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11044 1726853256.37801: _low_level_execute_command(): starting 11044 1726853256.37804: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853256.3769355-12109-116981106254153 `" && echo ansible-tmp-1726853256.3769355-12109-116981106254153="` echo /root/.ansible/tmp/ansible-tmp-1726853256.3769355-12109-116981106254153 `" ) && sleep 0' 11044 1726853256.38477: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: 
Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11044 1726853256.38549: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK <<< 11044 1726853256.38611: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11044 1726853256.38648: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11044 1726853256.40550: stdout chunk (state=3): >>>ansible-tmp-1726853256.3769355-12109-116981106254153=/root/.ansible/tmp/ansible-tmp-1726853256.3769355-12109-116981106254153 <<< 11044 1726853256.40734: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11044 1726853256.40737: stdout chunk (state=3): >>><<< 11044 1726853256.40740: stderr chunk (state=3): >>><<< 11044 1726853256.40763: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853256.3769355-12109-116981106254153=/root/.ansible/tmp/ansible-tmp-1726853256.3769355-12109-116981106254153 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11044 1726853256.40982: variable 'ansible_module_compression' from source: unknown 11044 1726853256.40985: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-1104467doc9gy/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 11044 1726853256.40988: variable 'ansible_facts' from source: unknown 11044 1726853256.41022: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853256.3769355-12109-116981106254153/AnsiballZ_stat.py 11044 1726853256.41201: Sending initial data 11044 1726853256.41212: Sent initial data (153 bytes) 11044 1726853256.41866: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11044 1726853256.41975: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 11044 1726853256.41996: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11044 1726853256.42009: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11044 1726853256.42085: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11044 1726853256.43610: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 11044 1726853256.43701: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 11044 1726853256.43746: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-1104467doc9gy/tmpwv65yvxh /root/.ansible/tmp/ansible-tmp-1726853256.3769355-12109-116981106254153/AnsiballZ_stat.py <<< 11044 1726853256.43769: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853256.3769355-12109-116981106254153/AnsiballZ_stat.py" <<< 11044 1726853256.43793: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-1104467doc9gy/tmpwv65yvxh" to remote "/root/.ansible/tmp/ansible-tmp-1726853256.3769355-12109-116981106254153/AnsiballZ_stat.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853256.3769355-12109-116981106254153/AnsiballZ_stat.py" <<< 11044 1726853256.44585: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11044 1726853256.44647: stderr chunk (state=3): >>><<< 11044 1726853256.44652: stdout chunk (state=3): >>><<< 11044 1726853256.44668: done transferring module to remote 11044 1726853256.44680: _low_level_execute_command(): starting 11044 1726853256.44683: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853256.3769355-12109-116981106254153/ /root/.ansible/tmp/ansible-tmp-1726853256.3769355-12109-116981106254153/AnsiballZ_stat.py && sleep 0' 11044 1726853256.45102: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11044 1726853256.45126: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: 
hostname 10.31.45.153 is address <<< 11044 1726853256.45129: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found <<< 11044 1726853256.45134: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11044 1726853256.45184: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 11044 1726853256.45187: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11044 1726853256.45233: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11044 1726853256.47000: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11044 1726853256.47003: stdout chunk (state=3): >>><<< 11044 1726853256.47011: stderr chunk (state=3): >>><<< 11044 1726853256.47138: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 
originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11044 1726853256.47142: _low_level_execute_command(): starting 11044 1726853256.47151: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853256.3769355-12109-116981106254153/AnsiballZ_stat.py && sleep 0' 11044 1726853256.47513: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11044 1726853256.47537: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 11044 1726853256.47541: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found <<< 11044 1726853256.47547: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11044 1726853256.47586: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK <<< 11044 1726853256.47598: stderr chunk (state=3): >>>debug2: 
mux_client_hello_exchange: master version 4 <<< 11044 1726853256.47644: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11044 1726853256.62824: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-bond0.1", "follow": false, "checksum_algorithm": "sha1"}}} <<< 11044 1726853256.64189: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.45.153 closed. <<< 11044 1726853256.64194: stdout chunk (state=3): >>><<< 11044 1726853256.64196: stderr chunk (state=3): >>><<< 11044 1726853256.64210: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-bond0.1", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master 
at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.45.153 closed. 11044 1726853256.64235: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/etc/sysconfig/network-scripts/ifcfg-bond0.1', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853256.3769355-12109-116981106254153/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11044 1726853256.64247: _low_level_execute_command(): starting 11044 1726853256.64250: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853256.3769355-12109-116981106254153/ > /dev/null 2>&1 && sleep 0' 11044 1726853256.64683: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11044 1726853256.64687: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11044 1726853256.64690: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11044 1726853256.64692: stderr chunk (state=3): >>>debug1: 
configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11044 1726853256.64694: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11044 1726853256.64745: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 11044 1726853256.64752: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11044 1726853256.64754: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11044 1726853256.64793: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11044 1726853256.66617: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11044 1726853256.66649: stderr chunk (state=3): >>><<< 11044 1726853256.66652: stdout chunk (state=3): >>><<< 11044 1726853256.66665: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11044 1726853256.66673: handler run complete 11044 1726853256.66690: attempt loop complete, returning result 11044 1726853256.66693: _execute() done 11044 1726853256.66695: dumping result to json 11044 1726853256.66698: done dumping result, returning 11044 1726853256.66705: done running TaskExecutor() for managed_node1/TASK: Stat profile file [02083763-bbaf-c5a6-f857-000000000444] 11044 1726853256.66708: sending task result for task 02083763-bbaf-c5a6-f857-000000000444 11044 1726853256.66803: done sending task result for task 02083763-bbaf-c5a6-f857-000000000444 11044 1726853256.66806: WORKER PROCESS EXITING ok: [managed_node1] => { "changed": false, "stat": { "exists": false } } 11044 1726853256.66894: no more pending results, returning what we have 11044 1726853256.66897: results queue empty 11044 1726853256.66898: checking for any_errors_fatal 11044 1726853256.66905: done checking for any_errors_fatal 11044 1726853256.66905: checking for max_fail_percentage 11044 1726853256.66907: done checking for max_fail_percentage 11044 1726853256.66908: checking to see if all hosts have failed and the running result is not ok 11044 1726853256.66908: done checking to see if all hosts have failed 11044 1726853256.66909: getting the remaining hosts for this loop 11044 1726853256.66911: done getting the remaining hosts for this loop 11044 1726853256.66914: getting the next task for host managed_node1 11044 1726853256.66921: done getting next task for host managed_node1 11044 1726853256.66923: ^ task is: TASK: 
Set NM profile exist flag based on the profile files 11044 1726853256.66927: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11044 1726853256.66930: getting variables 11044 1726853256.66931: in VariableManager get_vars() 11044 1726853256.66972: Calling all_inventory to load vars for managed_node1 11044 1726853256.66975: Calling groups_inventory to load vars for managed_node1 11044 1726853256.66977: Calling all_plugins_inventory to load vars for managed_node1 11044 1726853256.66987: Calling all_plugins_play to load vars for managed_node1 11044 1726853256.66990: Calling groups_plugins_inventory to load vars for managed_node1 11044 1726853256.66992: Calling groups_plugins_play to load vars for managed_node1 11044 1726853256.67890: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11044 1726853256.68735: done with get_vars() 11044 1726853256.68751: done getting variables 11044 1726853256.68795: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set NM profile exist flag based on the profile files] ******************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:17 Friday 20 September 2024 13:27:36 -0400 (0:00:00.361) 0:00:21.063 ****** 11044 1726853256.68820: entering _queue_task() for managed_node1/set_fact 11044 1726853256.69062: worker is 1 (out of 1 available) 11044 1726853256.69078: exiting _queue_task() for managed_node1/set_fact 11044 1726853256.69092: done queuing things up, now waiting for results queue to drain 11044 1726853256.69093: waiting for pending results... 11044 1726853256.69275: running TaskExecutor() for managed_node1/TASK: Set NM profile exist flag based on the profile files 11044 1726853256.69352: in run() - task 02083763-bbaf-c5a6-f857-000000000445 11044 1726853256.69368: variable 'ansible_search_path' from source: unknown 11044 1726853256.69373: variable 'ansible_search_path' from source: unknown 11044 1726853256.69401: calling self._execute() 11044 1726853256.69484: variable 'ansible_host' from source: host vars for 'managed_node1' 11044 1726853256.69488: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11044 1726853256.69497: variable 'omit' from source: magic vars 11044 1726853256.69780: variable 'ansible_distribution_major_version' from source: facts 11044 1726853256.69790: Evaluated conditional (ansible_distribution_major_version != '6'): True 11044 1726853256.69874: variable 'profile_stat' from source: set_fact 11044 1726853256.69885: Evaluated conditional (profile_stat.stat.exists): False 11044 1726853256.69888: when evaluation is False, skipping this task 11044 1726853256.69890: _execute() done 11044 1726853256.69893: dumping result to json 11044 1726853256.69897: done dumping 
result, returning 11044 1726853256.69903: done running TaskExecutor() for managed_node1/TASK: Set NM profile exist flag based on the profile files [02083763-bbaf-c5a6-f857-000000000445] 11044 1726853256.69908: sending task result for task 02083763-bbaf-c5a6-f857-000000000445 11044 1726853256.69991: done sending task result for task 02083763-bbaf-c5a6-f857-000000000445 11044 1726853256.69994: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 11044 1726853256.70037: no more pending results, returning what we have 11044 1726853256.70041: results queue empty 11044 1726853256.70042: checking for any_errors_fatal 11044 1726853256.70052: done checking for any_errors_fatal 11044 1726853256.70053: checking for max_fail_percentage 11044 1726853256.70054: done checking for max_fail_percentage 11044 1726853256.70055: checking to see if all hosts have failed and the running result is not ok 11044 1726853256.70056: done checking to see if all hosts have failed 11044 1726853256.70056: getting the remaining hosts for this loop 11044 1726853256.70058: done getting the remaining hosts for this loop 11044 1726853256.70061: getting the next task for host managed_node1 11044 1726853256.70066: done getting next task for host managed_node1 11044 1726853256.70069: ^ task is: TASK: Get NM profile info 11044 1726853256.70075: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11044 1726853256.70079: getting variables 11044 1726853256.70080: in VariableManager get_vars() 11044 1726853256.70120: Calling all_inventory to load vars for managed_node1 11044 1726853256.70122: Calling groups_inventory to load vars for managed_node1 11044 1726853256.70124: Calling all_plugins_inventory to load vars for managed_node1 11044 1726853256.70134: Calling all_plugins_play to load vars for managed_node1 11044 1726853256.70137: Calling groups_plugins_inventory to load vars for managed_node1 11044 1726853256.70139: Calling groups_plugins_play to load vars for managed_node1 11044 1726853256.70904: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11044 1726853256.71753: done with get_vars() 11044 1726853256.71769: done getting variables 11044 1726853256.71816: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Get NM profile info] ***************************************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:25 Friday 20 September 2024 13:27:36 -0400 (0:00:00.030) 0:00:21.094 ****** 11044 1726853256.71839: entering _queue_task() for managed_node1/shell 11044 1726853256.72083: worker is 1 (out of 1 available) 11044 1726853256.72099: exiting _queue_task() for managed_node1/shell 11044 1726853256.72112: done queuing things up, now waiting for 
results queue to drain 11044 1726853256.72113: waiting for pending results... 11044 1726853256.72290: running TaskExecutor() for managed_node1/TASK: Get NM profile info 11044 1726853256.72370: in run() - task 02083763-bbaf-c5a6-f857-000000000446 11044 1726853256.72384: variable 'ansible_search_path' from source: unknown 11044 1726853256.72388: variable 'ansible_search_path' from source: unknown 11044 1726853256.72415: calling self._execute() 11044 1726853256.72508: variable 'ansible_host' from source: host vars for 'managed_node1' 11044 1726853256.72512: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11044 1726853256.72535: variable 'omit' from source: magic vars 11044 1726853256.72800: variable 'ansible_distribution_major_version' from source: facts 11044 1726853256.72809: Evaluated conditional (ansible_distribution_major_version != '6'): True 11044 1726853256.72815: variable 'omit' from source: magic vars 11044 1726853256.72845: variable 'omit' from source: magic vars 11044 1726853256.72916: variable 'profile' from source: include params 11044 1726853256.72919: variable 'item' from source: include params 11044 1726853256.72965: variable 'item' from source: include params 11044 1726853256.72981: variable 'omit' from source: magic vars 11044 1726853256.73018: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11044 1726853256.73044: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11044 1726853256.73063: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11044 1726853256.73077: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11044 1726853256.73087: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11044 1726853256.73113: 
variable 'inventory_hostname' from source: host vars for 'managed_node1' 11044 1726853256.73116: variable 'ansible_host' from source: host vars for 'managed_node1' 11044 1726853256.73119: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11044 1726853256.73187: Set connection var ansible_timeout to 10 11044 1726853256.73194: Set connection var ansible_shell_executable to /bin/sh 11044 1726853256.73197: Set connection var ansible_shell_type to sh 11044 1726853256.73204: Set connection var ansible_module_compression to ZIP_DEFLATED 11044 1726853256.73206: Set connection var ansible_connection to ssh 11044 1726853256.73216: Set connection var ansible_pipelining to False 11044 1726853256.73230: variable 'ansible_shell_executable' from source: unknown 11044 1726853256.73233: variable 'ansible_connection' from source: unknown 11044 1726853256.73235: variable 'ansible_module_compression' from source: unknown 11044 1726853256.73237: variable 'ansible_shell_type' from source: unknown 11044 1726853256.73239: variable 'ansible_shell_executable' from source: unknown 11044 1726853256.73241: variable 'ansible_host' from source: host vars for 'managed_node1' 11044 1726853256.73248: variable 'ansible_pipelining' from source: unknown 11044 1726853256.73251: variable 'ansible_timeout' from source: unknown 11044 1726853256.73254: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11044 1726853256.73356: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11044 1726853256.73364: variable 'omit' from source: magic vars 11044 1726853256.73368: starting attempt loop 11044 1726853256.73373: running the handler 11044 1726853256.73381: Loading ActionModule 'command' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11044 1726853256.73397: _low_level_execute_command(): starting 11044 1726853256.73403: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11044 1726853256.73921: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11044 1726853256.73927: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 11044 1726853256.73931: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11044 1726853256.73977: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 11044 1726853256.73982: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11044 1726853256.74000: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11044 1726853256.74038: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11044 1726853256.75645: stdout chunk (state=3): >>>/root 
<<< 11044 1726853256.75745: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11044 1726853256.75777: stderr chunk (state=3): >>><<< 11044 1726853256.75780: stdout chunk (state=3): >>><<< 11044 1726853256.75800: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11044 1726853256.75811: _low_level_execute_command(): starting 11044 1726853256.75818: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853256.7579918-12127-126239867256955 `" && echo ansible-tmp-1726853256.7579918-12127-126239867256955="` echo /root/.ansible/tmp/ansible-tmp-1726853256.7579918-12127-126239867256955 `" ) && sleep 0' 11044 1726853256.76232: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 
2024 debug1: Reading configuration data /root/.ssh/config <<< 11044 1726853256.76246: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11044 1726853256.76266: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 11044 1726853256.76269: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11044 1726853256.76317: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 11044 1726853256.76321: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11044 1726853256.76331: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11044 1726853256.76387: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11044 1726853256.78267: stdout chunk (state=3): >>>ansible-tmp-1726853256.7579918-12127-126239867256955=/root/.ansible/tmp/ansible-tmp-1726853256.7579918-12127-126239867256955 <<< 11044 1726853256.78381: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11044 1726853256.78406: stderr chunk (state=3): >>><<< 11044 1726853256.78409: stdout chunk (state=3): >>><<< 11044 1726853256.78427: _low_level_execute_command() done: rc=0, 
stdout=ansible-tmp-1726853256.7579918-12127-126239867256955=/root/.ansible/tmp/ansible-tmp-1726853256.7579918-12127-126239867256955 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11044 1726853256.78456: variable 'ansible_module_compression' from source: unknown 11044 1726853256.78497: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-1104467doc9gy/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 11044 1726853256.78529: variable 'ansible_facts' from source: unknown 11044 1726853256.78581: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853256.7579918-12127-126239867256955/AnsiballZ_command.py 11044 1726853256.78685: Sending initial data 11044 1726853256.78689: Sent initial data (156 bytes) 11044 1726853256.79125: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data 
/root/.ssh/config <<< 11044 1726853256.79128: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found <<< 11044 1726853256.79130: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11044 1726853256.79132: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11044 1726853256.79134: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11044 1726853256.79177: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 11044 1726853256.79198: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11044 1726853256.79231: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11044 1726853256.80777: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 
debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 11044 1726853256.80784: stderr chunk (state=3): >>>debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 11044 1726853256.80810: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 11044 1726853256.80843: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-1104467doc9gy/tmph0pyyv7g /root/.ansible/tmp/ansible-tmp-1726853256.7579918-12127-126239867256955/AnsiballZ_command.py <<< 11044 1726853256.80847: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853256.7579918-12127-126239867256955/AnsiballZ_command.py" <<< 11044 1726853256.80887: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-1104467doc9gy/tmph0pyyv7g" to remote "/root/.ansible/tmp/ansible-tmp-1726853256.7579918-12127-126239867256955/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853256.7579918-12127-126239867256955/AnsiballZ_command.py" <<< 11044 1726853256.81407: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11044 1726853256.81446: stderr chunk (state=3): >>><<< 11044 1726853256.81451: stdout chunk (state=3): >>><<< 11044 1726853256.81468: done transferring module to remote 11044 1726853256.81479: _low_level_execute_command(): starting 11044 1726853256.81483: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853256.7579918-12127-126239867256955/ /root/.ansible/tmp/ansible-tmp-1726853256.7579918-12127-126239867256955/AnsiballZ_command.py && sleep 0' 11044 1726853256.81896: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config <<< 11044 1726853256.81899: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11044 1726853256.81905: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11044 1726853256.81908: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11044 1726853256.81951: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 11044 1726853256.81954: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11044 1726853256.82002: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11044 1726853256.83863: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11044 1726853256.83866: stdout chunk (state=3): >>><<< 11044 1726853256.83868: stderr chunk (state=3): >>><<< 11044 1726853256.83873: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration 
data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11044 1726853256.83875: _low_level_execute_command(): starting 11044 1726853256.83877: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853256.7579918-12127-126239867256955/AnsiballZ_command.py && sleep 0' 11044 1726853256.84406: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11044 1726853256.84420: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11044 1726853256.84490: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 
'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11044 1726853256.84552: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 11044 1726853256.84567: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11044 1726853256.84593: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11044 1726853256.84667: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11044 1726853257.01619: stdout chunk (state=3): >>> {"changed": true, "stdout": "bond0.1 /etc/NetworkManager/system-connections/bond0.1.nmconnection ", "stderr": "", "rc": 0, "cmd": "nmcli -f NAME,FILENAME connection show |grep bond0.1 | grep /etc", "start": "2024-09-20 13:27:36.995198", "end": "2024-09-20 13:27:37.015603", "delta": "0:00:00.020405", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep bond0.1 | grep /etc", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 11044 1726853257.03121: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.45.153 closed. 
<<< 11044 1726853257.03142: stderr chunk (state=3): >>><<< 11044 1726853257.03146: stdout chunk (state=3): >>><<< 11044 1726853257.03168: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "bond0.1 /etc/NetworkManager/system-connections/bond0.1.nmconnection ", "stderr": "", "rc": 0, "cmd": "nmcli -f NAME,FILENAME connection show |grep bond0.1 | grep /etc", "start": "2024-09-20 13:27:36.995198", "end": "2024-09-20 13:27:37.015603", "delta": "0:00:00.020405", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep bond0.1 | grep /etc", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 
Shared connection to 10.31.45.153 closed. 11044 1726853257.03198: done with _execute_module (ansible.legacy.command, {'_raw_params': 'nmcli -f NAME,FILENAME connection show |grep bond0.1 | grep /etc', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853256.7579918-12127-126239867256955/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11044 1726853257.03206: _low_level_execute_command(): starting 11044 1726853257.03212: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853256.7579918-12127-126239867256955/ > /dev/null 2>&1 && sleep 0' 11044 1726853257.03658: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11044 1726853257.03662: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11044 1726853257.03664: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 11044 1726853257.03666: stderr chunk (state=3): >>>debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found <<< 11044 1726853257.03669: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11044 1726853257.03715: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 11044 1726853257.03720: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11044 1726853257.03722: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11044 1726853257.03765: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11044 1726853257.05578: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11044 1726853257.05603: stderr chunk (state=3): >>><<< 11044 1726853257.05606: stdout chunk (state=3): >>><<< 11044 1726853257.05619: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11044 1726853257.05625: handler run complete 11044 1726853257.05648: Evaluated conditional (False): False 11044 1726853257.05655: attempt loop complete, returning result 11044 1726853257.05658: _execute() done 11044 1726853257.05660: dumping result to json 11044 1726853257.05665: done dumping result, returning 11044 1726853257.05674: done running TaskExecutor() for managed_node1/TASK: Get NM profile info [02083763-bbaf-c5a6-f857-000000000446] 11044 1726853257.05676: sending task result for task 02083763-bbaf-c5a6-f857-000000000446 11044 1726853257.05776: done sending task result for task 02083763-bbaf-c5a6-f857-000000000446 11044 1726853257.05779: WORKER PROCESS EXITING ok: [managed_node1] => { "changed": false, "cmd": "nmcli -f NAME,FILENAME connection show |grep bond0.1 | grep /etc", "delta": "0:00:00.020405", "end": "2024-09-20 13:27:37.015603", "rc": 0, "start": "2024-09-20 13:27:36.995198" } STDOUT: bond0.1 /etc/NetworkManager/system-connections/bond0.1.nmconnection 11044 1726853257.05865: no more pending results, returning what we have 11044 1726853257.05868: results queue empty 11044 1726853257.05869: checking for any_errors_fatal 11044 1726853257.05878: done checking for any_errors_fatal 11044 1726853257.05879: checking for max_fail_percentage 11044 1726853257.05881: done checking for max_fail_percentage 11044 1726853257.05881: checking to see if all hosts have failed and the running result is not ok 11044 1726853257.05882: done checking to see if all hosts have failed 11044 1726853257.05883: getting the remaining hosts for this loop 11044 1726853257.05884: done getting the remaining hosts for this loop 11044 1726853257.05888: getting the next task for host managed_node1 11044 1726853257.05894: done getting 
next task for host managed_node1 11044 1726853257.05896: ^ task is: TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output 11044 1726853257.05899: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11044 1726853257.05903: getting variables 11044 1726853257.05905: in VariableManager get_vars() 11044 1726853257.05943: Calling all_inventory to load vars for managed_node1 11044 1726853257.05948: Calling groups_inventory to load vars for managed_node1 11044 1726853257.05950: Calling all_plugins_inventory to load vars for managed_node1 11044 1726853257.05960: Calling all_plugins_play to load vars for managed_node1 11044 1726853257.05962: Calling groups_plugins_inventory to load vars for managed_node1 11044 1726853257.05964: Calling groups_plugins_play to load vars for managed_node1 11044 1726853257.06857: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11044 1726853257.07709: done with get_vars() 11044 1726853257.07725: done getting variables 11044 1726853257.07770: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set NM profile exist flag and ansible_managed flag true based on the nmcli output] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:35 Friday 20 September 2024 13:27:37 -0400 (0:00:00.359) 0:00:21.453 ****** 11044 1726853257.07793: entering _queue_task() for managed_node1/set_fact 11044 1726853257.08049: worker is 1 (out of 1 available) 11044 1726853257.08062: exiting _queue_task() for managed_node1/set_fact 11044 1726853257.08077: done queuing things up, now waiting for results queue to drain 11044 1726853257.08079: waiting for pending results... 
11044 1726853257.08252: running TaskExecutor() for managed_node1/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output 11044 1726853257.08325: in run() - task 02083763-bbaf-c5a6-f857-000000000447 11044 1726853257.08337: variable 'ansible_search_path' from source: unknown 11044 1726853257.08341: variable 'ansible_search_path' from source: unknown 11044 1726853257.08370: calling self._execute() 11044 1726853257.08451: variable 'ansible_host' from source: host vars for 'managed_node1' 11044 1726853257.08454: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11044 1726853257.08463: variable 'omit' from source: magic vars 11044 1726853257.08729: variable 'ansible_distribution_major_version' from source: facts 11044 1726853257.08741: Evaluated conditional (ansible_distribution_major_version != '6'): True 11044 1726853257.08825: variable 'nm_profile_exists' from source: set_fact 11044 1726853257.08836: Evaluated conditional (nm_profile_exists.rc == 0): True 11044 1726853257.08843: variable 'omit' from source: magic vars 11044 1726853257.08876: variable 'omit' from source: magic vars 11044 1726853257.08898: variable 'omit' from source: magic vars 11044 1726853257.08931: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11044 1726853257.08959: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11044 1726853257.08979: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11044 1726853257.08993: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11044 1726853257.09003: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11044 1726853257.09028: variable 'inventory_hostname' from source: host vars for 'managed_node1' 
11044 1726853257.09031: variable 'ansible_host' from source: host vars for 'managed_node1' 11044 1726853257.09034: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11044 1726853257.09105: Set connection var ansible_timeout to 10 11044 1726853257.09112: Set connection var ansible_shell_executable to /bin/sh 11044 1726853257.09115: Set connection var ansible_shell_type to sh 11044 1726853257.09119: Set connection var ansible_module_compression to ZIP_DEFLATED 11044 1726853257.09124: Set connection var ansible_connection to ssh 11044 1726853257.09129: Set connection var ansible_pipelining to False 11044 1726853257.09149: variable 'ansible_shell_executable' from source: unknown 11044 1726853257.09153: variable 'ansible_connection' from source: unknown 11044 1726853257.09155: variable 'ansible_module_compression' from source: unknown 11044 1726853257.09157: variable 'ansible_shell_type' from source: unknown 11044 1726853257.09160: variable 'ansible_shell_executable' from source: unknown 11044 1726853257.09162: variable 'ansible_host' from source: host vars for 'managed_node1' 11044 1726853257.09164: variable 'ansible_pipelining' from source: unknown 11044 1726853257.09166: variable 'ansible_timeout' from source: unknown 11044 1726853257.09168: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11044 1726853257.09269: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11044 1726853257.09279: variable 'omit' from source: magic vars 11044 1726853257.09284: starting attempt loop 11044 1726853257.09286: running the handler 11044 1726853257.09299: handler run complete 11044 1726853257.09308: attempt loop complete, returning result 11044 1726853257.09310: _execute() done 
11044 1726853257.09313: dumping result to json 11044 1726853257.09315: done dumping result, returning 11044 1726853257.09322: done running TaskExecutor() for managed_node1/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output [02083763-bbaf-c5a6-f857-000000000447] 11044 1726853257.09325: sending task result for task 02083763-bbaf-c5a6-f857-000000000447 11044 1726853257.09403: done sending task result for task 02083763-bbaf-c5a6-f857-000000000447 11044 1726853257.09407: WORKER PROCESS EXITING ok: [managed_node1] => { "ansible_facts": { "lsr_net_profile_ansible_managed": true, "lsr_net_profile_exists": true, "lsr_net_profile_fingerprint": true }, "changed": false } 11044 1726853257.09458: no more pending results, returning what we have 11044 1726853257.09461: results queue empty 11044 1726853257.09462: checking for any_errors_fatal 11044 1726853257.09470: done checking for any_errors_fatal 11044 1726853257.09472: checking for max_fail_percentage 11044 1726853257.09474: done checking for max_fail_percentage 11044 1726853257.09475: checking to see if all hosts have failed and the running result is not ok 11044 1726853257.09476: done checking to see if all hosts have failed 11044 1726853257.09476: getting the remaining hosts for this loop 11044 1726853257.09477: done getting the remaining hosts for this loop 11044 1726853257.09481: getting the next task for host managed_node1 11044 1726853257.09489: done getting next task for host managed_node1 11044 1726853257.09491: ^ task is: TASK: Get the ansible_managed comment in ifcfg-{{ profile }} 11044 1726853257.09495: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11044 1726853257.09499: getting variables 11044 1726853257.09501: in VariableManager get_vars() 11044 1726853257.09540: Calling all_inventory to load vars for managed_node1 11044 1726853257.09543: Calling groups_inventory to load vars for managed_node1 11044 1726853257.09548: Calling all_plugins_inventory to load vars for managed_node1 11044 1726853257.09558: Calling all_plugins_play to load vars for managed_node1 11044 1726853257.09560: Calling groups_plugins_inventory to load vars for managed_node1 11044 1726853257.09562: Calling groups_plugins_play to load vars for managed_node1 11044 1726853257.10338: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11044 1726853257.11282: done with get_vars() 11044 1726853257.11297: done getting variables 11044 1726853257.11339: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 11044 1726853257.11422: variable 'profile' from source: include params 11044 1726853257.11426: variable 'item' from source: include params 11044 1726853257.11467: variable 'item' from source: include params TASK [Get the ansible_managed comment in ifcfg-bond0.1] ************************ task path: 
/tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:49 Friday 20 September 2024 13:27:37 -0400 (0:00:00.036) 0:00:21.490 ****** 11044 1726853257.11495: entering _queue_task() for managed_node1/command 11044 1726853257.11730: worker is 1 (out of 1 available) 11044 1726853257.11742: exiting _queue_task() for managed_node1/command 11044 1726853257.11756: done queuing things up, now waiting for results queue to drain 11044 1726853257.11758: waiting for pending results... 11044 1726853257.11930: running TaskExecutor() for managed_node1/TASK: Get the ansible_managed comment in ifcfg-bond0.1 11044 1726853257.12009: in run() - task 02083763-bbaf-c5a6-f857-000000000449 11044 1726853257.12020: variable 'ansible_search_path' from source: unknown 11044 1726853257.12024: variable 'ansible_search_path' from source: unknown 11044 1726853257.12051: calling self._execute() 11044 1726853257.12126: variable 'ansible_host' from source: host vars for 'managed_node1' 11044 1726853257.12129: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11044 1726853257.12139: variable 'omit' from source: magic vars 11044 1726853257.12401: variable 'ansible_distribution_major_version' from source: facts 11044 1726853257.12411: Evaluated conditional (ansible_distribution_major_version != '6'): True 11044 1726853257.12495: variable 'profile_stat' from source: set_fact 11044 1726853257.12506: Evaluated conditional (profile_stat.stat.exists): False 11044 1726853257.12509: when evaluation is False, skipping this task 11044 1726853257.12511: _execute() done 11044 1726853257.12514: dumping result to json 11044 1726853257.12516: done dumping result, returning 11044 1726853257.12525: done running TaskExecutor() for managed_node1/TASK: Get the ansible_managed comment in ifcfg-bond0.1 [02083763-bbaf-c5a6-f857-000000000449] 11044 1726853257.12527: sending task result for task 02083763-bbaf-c5a6-f857-000000000449 11044 
1726853257.12635: done sending task result for task 02083763-bbaf-c5a6-f857-000000000449 11044 1726853257.12639: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 11044 1726853257.12691: no more pending results, returning what we have 11044 1726853257.12695: results queue empty 11044 1726853257.12696: checking for any_errors_fatal 11044 1726853257.12701: done checking for any_errors_fatal 11044 1726853257.12702: checking for max_fail_percentage 11044 1726853257.12703: done checking for max_fail_percentage 11044 1726853257.12704: checking to see if all hosts have failed and the running result is not ok 11044 1726853257.12705: done checking to see if all hosts have failed 11044 1726853257.12706: getting the remaining hosts for this loop 11044 1726853257.12707: done getting the remaining hosts for this loop 11044 1726853257.12710: getting the next task for host managed_node1 11044 1726853257.12716: done getting next task for host managed_node1 11044 1726853257.12718: ^ task is: TASK: Verify the ansible_managed comment in ifcfg-{{ profile }} 11044 1726853257.12721: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False 11044 1726853257.12724: getting variables 11044 1726853257.12725: in VariableManager get_vars() 11044 1726853257.12762: Calling all_inventory to load vars for managed_node1 11044 1726853257.12765: Calling groups_inventory to load vars for managed_node1 11044 1726853257.12767: Calling all_plugins_inventory to load vars for managed_node1 11044 1726853257.12778: Calling all_plugins_play to load vars for managed_node1 11044 1726853257.12781: Calling groups_plugins_inventory to load vars for managed_node1 11044 1726853257.12783: Calling groups_plugins_play to load vars for managed_node1 11044 1726853257.13527: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11044 1726853257.14384: done with get_vars() 11044 1726853257.14400: done getting variables 11044 1726853257.14441: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 11044 1726853257.14520: variable 'profile' from source: include params 11044 1726853257.14522: variable 'item' from source: include params 11044 1726853257.14564: variable 'item' from source: include params TASK [Verify the ansible_managed comment in ifcfg-bond0.1] ********************* task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:56 Friday 20 September 2024 13:27:37 -0400 (0:00:00.030) 0:00:21.521 ****** 11044 1726853257.14587: entering _queue_task() for managed_node1/set_fact 11044 1726853257.14822: worker is 1 (out of 1 available) 11044 1726853257.14836: exiting _queue_task() for managed_node1/set_fact 11044 1726853257.14851: done queuing things up, now waiting for results queue 
to drain 11044 1726853257.14853: waiting for pending results... 11044 1726853257.15027: running TaskExecutor() for managed_node1/TASK: Verify the ansible_managed comment in ifcfg-bond0.1 11044 1726853257.15107: in run() - task 02083763-bbaf-c5a6-f857-00000000044a 11044 1726853257.15120: variable 'ansible_search_path' from source: unknown 11044 1726853257.15124: variable 'ansible_search_path' from source: unknown 11044 1726853257.15153: calling self._execute() 11044 1726853257.15233: variable 'ansible_host' from source: host vars for 'managed_node1' 11044 1726853257.15236: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11044 1726853257.15245: variable 'omit' from source: magic vars 11044 1726853257.15510: variable 'ansible_distribution_major_version' from source: facts 11044 1726853257.15522: Evaluated conditional (ansible_distribution_major_version != '6'): True 11044 1726853257.15602: variable 'profile_stat' from source: set_fact 11044 1726853257.15614: Evaluated conditional (profile_stat.stat.exists): False 11044 1726853257.15617: when evaluation is False, skipping this task 11044 1726853257.15622: _execute() done 11044 1726853257.15625: dumping result to json 11044 1726853257.15627: done dumping result, returning 11044 1726853257.15637: done running TaskExecutor() for managed_node1/TASK: Verify the ansible_managed comment in ifcfg-bond0.1 [02083763-bbaf-c5a6-f857-00000000044a] 11044 1726853257.15640: sending task result for task 02083763-bbaf-c5a6-f857-00000000044a 11044 1726853257.15719: done sending task result for task 02083763-bbaf-c5a6-f857-00000000044a 11044 1726853257.15722: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 11044 1726853257.15782: no more pending results, returning what we have 11044 1726853257.15785: results queue empty 11044 1726853257.15786: checking for any_errors_fatal 11044 
1726853257.15792: done checking for any_errors_fatal 11044 1726853257.15793: checking for max_fail_percentage 11044 1726853257.15794: done checking for max_fail_percentage 11044 1726853257.15795: checking to see if all hosts have failed and the running result is not ok 11044 1726853257.15795: done checking to see if all hosts have failed 11044 1726853257.15796: getting the remaining hosts for this loop 11044 1726853257.15798: done getting the remaining hosts for this loop 11044 1726853257.15801: getting the next task for host managed_node1 11044 1726853257.15806: done getting next task for host managed_node1 11044 1726853257.15809: ^ task is: TASK: Get the fingerprint comment in ifcfg-{{ profile }} 11044 1726853257.15812: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11044 1726853257.15816: getting variables 11044 1726853257.15817: in VariableManager get_vars() 11044 1726853257.15852: Calling all_inventory to load vars for managed_node1 11044 1726853257.15855: Calling groups_inventory to load vars for managed_node1 11044 1726853257.15857: Calling all_plugins_inventory to load vars for managed_node1 11044 1726853257.15866: Calling all_plugins_play to load vars for managed_node1 11044 1726853257.15869: Calling groups_plugins_inventory to load vars for managed_node1 11044 1726853257.15873: Calling groups_plugins_play to load vars for managed_node1 11044 1726853257.16715: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11044 1726853257.20567: done with get_vars() 11044 1726853257.20586: done getting variables 11044 1726853257.20625: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 11044 1726853257.20691: variable 'profile' from source: include params 11044 1726853257.20693: variable 'item' from source: include params 11044 1726853257.20732: variable 'item' from source: include params TASK [Get the fingerprint comment in ifcfg-bond0.1] **************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:62 Friday 20 September 2024 13:27:37 -0400 (0:00:00.061) 0:00:21.583 ****** 11044 1726853257.20752: entering _queue_task() for managed_node1/command 11044 1726853257.21016: worker is 1 (out of 1 available) 11044 1726853257.21031: exiting _queue_task() for managed_node1/command 11044 1726853257.21044: done queuing things up, now waiting for results queue to drain 11044 1726853257.21045: waiting for pending results... 
11044 1726853257.21226: running TaskExecutor() for managed_node1/TASK: Get the fingerprint comment in ifcfg-bond0.1 11044 1726853257.21317: in run() - task 02083763-bbaf-c5a6-f857-00000000044b 11044 1726853257.21327: variable 'ansible_search_path' from source: unknown 11044 1726853257.21331: variable 'ansible_search_path' from source: unknown 11044 1726853257.21363: calling self._execute() 11044 1726853257.21443: variable 'ansible_host' from source: host vars for 'managed_node1' 11044 1726853257.21451: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11044 1726853257.21459: variable 'omit' from source: magic vars 11044 1726853257.21743: variable 'ansible_distribution_major_version' from source: facts 11044 1726853257.21755: Evaluated conditional (ansible_distribution_major_version != '6'): True 11044 1726853257.21842: variable 'profile_stat' from source: set_fact 11044 1726853257.21855: Evaluated conditional (profile_stat.stat.exists): False 11044 1726853257.21858: when evaluation is False, skipping this task 11044 1726853257.21862: _execute() done 11044 1726853257.21865: dumping result to json 11044 1726853257.21868: done dumping result, returning 11044 1726853257.21874: done running TaskExecutor() for managed_node1/TASK: Get the fingerprint comment in ifcfg-bond0.1 [02083763-bbaf-c5a6-f857-00000000044b] 11044 1726853257.21879: sending task result for task 02083763-bbaf-c5a6-f857-00000000044b 11044 1726853257.21961: done sending task result for task 02083763-bbaf-c5a6-f857-00000000044b 11044 1726853257.21964: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 11044 1726853257.22009: no more pending results, returning what we have 11044 1726853257.22012: results queue empty 11044 1726853257.22013: checking for any_errors_fatal 11044 1726853257.22021: done checking for any_errors_fatal 11044 1726853257.22021: checking for 
max_fail_percentage 11044 1726853257.22023: done checking for max_fail_percentage 11044 1726853257.22024: checking to see if all hosts have failed and the running result is not ok 11044 1726853257.22024: done checking to see if all hosts have failed 11044 1726853257.22025: getting the remaining hosts for this loop 11044 1726853257.22026: done getting the remaining hosts for this loop 11044 1726853257.22030: getting the next task for host managed_node1 11044 1726853257.22036: done getting next task for host managed_node1 11044 1726853257.22039: ^ task is: TASK: Verify the fingerprint comment in ifcfg-{{ profile }} 11044 1726853257.22042: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11044 1726853257.22047: getting variables 11044 1726853257.22049: in VariableManager get_vars() 11044 1726853257.22091: Calling all_inventory to load vars for managed_node1 11044 1726853257.22093: Calling groups_inventory to load vars for managed_node1 11044 1726853257.22096: Calling all_plugins_inventory to load vars for managed_node1 11044 1726853257.22108: Calling all_plugins_play to load vars for managed_node1 11044 1726853257.22110: Calling groups_plugins_inventory to load vars for managed_node1 11044 1726853257.22113: Calling groups_plugins_play to load vars for managed_node1 11044 1726853257.22878: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11044 1726853257.23732: done with get_vars() 11044 1726853257.23747: done getting variables 11044 1726853257.23790: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 11044 1726853257.23868: variable 'profile' from source: include params 11044 1726853257.23873: variable 'item' from source: include params 11044 1726853257.23913: variable 'item' from source: include params TASK [Verify the fingerprint comment in ifcfg-bond0.1] ************************* task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:69 Friday 20 September 2024 13:27:37 -0400 (0:00:00.031) 0:00:21.615 ****** 11044 1726853257.23936: entering _queue_task() for managed_node1/set_fact 11044 1726853257.24167: worker is 1 (out of 1 available) 11044 1726853257.24182: exiting _queue_task() for managed_node1/set_fact 11044 1726853257.24197: done queuing things up, now waiting for results queue to drain 11044 1726853257.24198: waiting for pending results... 
11044 1726853257.24375: running TaskExecutor() for managed_node1/TASK: Verify the fingerprint comment in ifcfg-bond0.1 11044 1726853257.24446: in run() - task 02083763-bbaf-c5a6-f857-00000000044c 11044 1726853257.24460: variable 'ansible_search_path' from source: unknown 11044 1726853257.24464: variable 'ansible_search_path' from source: unknown 11044 1726853257.24493: calling self._execute() 11044 1726853257.24570: variable 'ansible_host' from source: host vars for 'managed_node1' 11044 1726853257.24575: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11044 1726853257.24583: variable 'omit' from source: magic vars 11044 1726853257.24866: variable 'ansible_distribution_major_version' from source: facts 11044 1726853257.24870: Evaluated conditional (ansible_distribution_major_version != '6'): True 11044 1726853257.24957: variable 'profile_stat' from source: set_fact 11044 1726853257.24977: Evaluated conditional (profile_stat.stat.exists): False 11044 1726853257.24980: when evaluation is False, skipping this task 11044 1726853257.24983: _execute() done 11044 1726853257.24986: dumping result to json 11044 1726853257.24988: done dumping result, returning 11044 1726853257.24991: done running TaskExecutor() for managed_node1/TASK: Verify the fingerprint comment in ifcfg-bond0.1 [02083763-bbaf-c5a6-f857-00000000044c] 11044 1726853257.24993: sending task result for task 02083763-bbaf-c5a6-f857-00000000044c 11044 1726853257.25073: done sending task result for task 02083763-bbaf-c5a6-f857-00000000044c 11044 1726853257.25076: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 11044 1726853257.25120: no more pending results, returning what we have 11044 1726853257.25123: results queue empty 11044 1726853257.25124: checking for any_errors_fatal 11044 1726853257.25131: done checking for any_errors_fatal 11044 1726853257.25132: checking 
for max_fail_percentage 11044 1726853257.25134: done checking for max_fail_percentage 11044 1726853257.25135: checking to see if all hosts have failed and the running result is not ok 11044 1726853257.25135: done checking to see if all hosts have failed 11044 1726853257.25136: getting the remaining hosts for this loop 11044 1726853257.25137: done getting the remaining hosts for this loop 11044 1726853257.25141: getting the next task for host managed_node1 11044 1726853257.25148: done getting next task for host managed_node1 11044 1726853257.25151: ^ task is: TASK: Assert that the profile is present - '{{ profile }}' 11044 1726853257.25154: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11044 1726853257.25158: getting variables 11044 1726853257.25160: in VariableManager get_vars() 11044 1726853257.25202: Calling all_inventory to load vars for managed_node1 11044 1726853257.25204: Calling groups_inventory to load vars for managed_node1 11044 1726853257.25207: Calling all_plugins_inventory to load vars for managed_node1 11044 1726853257.25216: Calling all_plugins_play to load vars for managed_node1 11044 1726853257.25219: Calling groups_plugins_inventory to load vars for managed_node1 11044 1726853257.25221: Calling groups_plugins_play to load vars for managed_node1 11044 1726853257.26084: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11044 1726853257.26925: done with get_vars() 11044 1726853257.26939: done getting variables 11044 1726853257.26982: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 11044 1726853257.27062: variable 'profile' from source: include params 11044 1726853257.27065: variable 'item' from source: include params 11044 1726853257.27104: variable 'item' from source: include params TASK [Assert that the profile is present - 'bond0.1'] ************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:5 Friday 20 September 2024 13:27:37 -0400 (0:00:00.031) 0:00:21.647 ****** 11044 1726853257.27128: entering _queue_task() for managed_node1/assert 11044 1726853257.27355: worker is 1 (out of 1 available) 11044 1726853257.27369: exiting _queue_task() for managed_node1/assert 11044 1726853257.27383: done queuing things up, now waiting for results queue to drain 11044 1726853257.27385: waiting for pending results... 
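The assert task queued here is at assert_profile_present.yml:5, and the log later shows it evaluating the conditional `lsr_net_profile_exists` to True. A hypothetical reconstruction of what that task likely looks like (sketch only; the actual file contents are not shown in this log):

```yaml
# Hypothetical reconstruction from the task name and the
# conditional (lsr_net_profile_exists) evaluated in this log.
- name: "Assert that the profile is present - '{{ profile }}'"
  ansible.builtin.assert:
    that:
      - lsr_net_profile_exists
```

The "All assertions passed" MSG in the result below is the assert action's default success message.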
11044 1726853257.27567: running TaskExecutor() for managed_node1/TASK: Assert that the profile is present - 'bond0.1' 11044 1726853257.27635: in run() - task 02083763-bbaf-c5a6-f857-00000000026f 11044 1726853257.27649: variable 'ansible_search_path' from source: unknown 11044 1726853257.27652: variable 'ansible_search_path' from source: unknown 11044 1726853257.27685: calling self._execute() 11044 1726853257.27765: variable 'ansible_host' from source: host vars for 'managed_node1' 11044 1726853257.27773: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11044 1726853257.27782: variable 'omit' from source: magic vars 11044 1726853257.28056: variable 'ansible_distribution_major_version' from source: facts 11044 1726853257.28066: Evaluated conditional (ansible_distribution_major_version != '6'): True 11044 1726853257.28073: variable 'omit' from source: magic vars 11044 1726853257.28102: variable 'omit' from source: magic vars 11044 1726853257.28173: variable 'profile' from source: include params 11044 1726853257.28177: variable 'item' from source: include params 11044 1726853257.28220: variable 'item' from source: include params 11044 1726853257.28235: variable 'omit' from source: magic vars 11044 1726853257.28268: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11044 1726853257.28297: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11044 1726853257.28313: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11044 1726853257.28327: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11044 1726853257.28337: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11044 1726853257.28365: variable 'inventory_hostname' from source: host vars for 
'managed_node1' 11044 1726853257.28368: variable 'ansible_host' from source: host vars for 'managed_node1' 11044 1726853257.28372: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11044 1726853257.28436: Set connection var ansible_timeout to 10 11044 1726853257.28443: Set connection var ansible_shell_executable to /bin/sh 11044 1726853257.28449: Set connection var ansible_shell_type to sh 11044 1726853257.28452: Set connection var ansible_module_compression to ZIP_DEFLATED 11044 1726853257.28454: Set connection var ansible_connection to ssh 11044 1726853257.28460: Set connection var ansible_pipelining to False 11044 1726853257.28482: variable 'ansible_shell_executable' from source: unknown 11044 1726853257.28485: variable 'ansible_connection' from source: unknown 11044 1726853257.28488: variable 'ansible_module_compression' from source: unknown 11044 1726853257.28490: variable 'ansible_shell_type' from source: unknown 11044 1726853257.28493: variable 'ansible_shell_executable' from source: unknown 11044 1726853257.28495: variable 'ansible_host' from source: host vars for 'managed_node1' 11044 1726853257.28497: variable 'ansible_pipelining' from source: unknown 11044 1726853257.28500: variable 'ansible_timeout' from source: unknown 11044 1726853257.28503: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11044 1726853257.28601: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11044 1726853257.28609: variable 'omit' from source: magic vars 11044 1726853257.28616: starting attempt loop 11044 1726853257.28619: running the handler 11044 1726853257.28694: variable 'lsr_net_profile_exists' from source: set_fact 11044 1726853257.28698: Evaluated conditional 
(lsr_net_profile_exists): True 11044 1726853257.28704: handler run complete 11044 1726853257.28715: attempt loop complete, returning result 11044 1726853257.28718: _execute() done 11044 1726853257.28723: dumping result to json 11044 1726853257.28725: done dumping result, returning 11044 1726853257.28728: done running TaskExecutor() for managed_node1/TASK: Assert that the profile is present - 'bond0.1' [02083763-bbaf-c5a6-f857-00000000026f] 11044 1726853257.28798: sending task result for task 02083763-bbaf-c5a6-f857-00000000026f 11044 1726853257.28860: done sending task result for task 02083763-bbaf-c5a6-f857-00000000026f 11044 1726853257.28862: WORKER PROCESS EXITING ok: [managed_node1] => { "changed": false } MSG: All assertions passed 11044 1726853257.28907: no more pending results, returning what we have 11044 1726853257.28909: results queue empty 11044 1726853257.28910: checking for any_errors_fatal 11044 1726853257.28914: done checking for any_errors_fatal 11044 1726853257.28915: checking for max_fail_percentage 11044 1726853257.28916: done checking for max_fail_percentage 11044 1726853257.28917: checking to see if all hosts have failed and the running result is not ok 11044 1726853257.28918: done checking to see if all hosts have failed 11044 1726853257.28919: getting the remaining hosts for this loop 11044 1726853257.28920: done getting the remaining hosts for this loop 11044 1726853257.28922: getting the next task for host managed_node1 11044 1726853257.28927: done getting next task for host managed_node1 11044 1726853257.28929: ^ task is: TASK: Assert that the ansible managed comment is present in '{{ profile }}' 11044 1726853257.28931: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11044 1726853257.28934: getting variables 11044 1726853257.28935: in VariableManager get_vars() 11044 1726853257.28972: Calling all_inventory to load vars for managed_node1 11044 1726853257.28975: Calling groups_inventory to load vars for managed_node1 11044 1726853257.28978: Calling all_plugins_inventory to load vars for managed_node1 11044 1726853257.28986: Calling all_plugins_play to load vars for managed_node1 11044 1726853257.28989: Calling groups_plugins_inventory to load vars for managed_node1 11044 1726853257.28991: Calling groups_plugins_play to load vars for managed_node1 11044 1726853257.29729: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11044 1726853257.30593: done with get_vars() 11044 1726853257.30608: done getting variables 11044 1726853257.30652: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 11044 1726853257.30731: variable 'profile' from source: include params 11044 1726853257.30734: variable 'item' from source: include params 11044 1726853257.30777: variable 'item' from source: include params TASK [Assert that the ansible managed comment is present in 'bond0.1'] ********* task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:10 Friday 20 September 2024 13:27:37 -0400 
(0:00:00.036) 0:00:21.683 ****** 11044 1726853257.30802: entering _queue_task() for managed_node1/assert 11044 1726853257.31037: worker is 1 (out of 1 available) 11044 1726853257.31054: exiting _queue_task() for managed_node1/assert 11044 1726853257.31068: done queuing things up, now waiting for results queue to drain 11044 1726853257.31069: waiting for pending results... 11044 1726853257.31235: running TaskExecutor() for managed_node1/TASK: Assert that the ansible managed comment is present in 'bond0.1' 11044 1726853257.31308: in run() - task 02083763-bbaf-c5a6-f857-000000000270 11044 1726853257.31319: variable 'ansible_search_path' from source: unknown 11044 1726853257.31323: variable 'ansible_search_path' from source: unknown 11044 1726853257.31351: calling self._execute() 11044 1726853257.31429: variable 'ansible_host' from source: host vars for 'managed_node1' 11044 1726853257.31433: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11044 1726853257.31442: variable 'omit' from source: magic vars 11044 1726853257.31701: variable 'ansible_distribution_major_version' from source: facts 11044 1726853257.31710: Evaluated conditional (ansible_distribution_major_version != '6'): True 11044 1726853257.31716: variable 'omit' from source: magic vars 11044 1726853257.31748: variable 'omit' from source: magic vars 11044 1726853257.31823: variable 'profile' from source: include params 11044 1726853257.31826: variable 'item' from source: include params 11044 1726853257.31875: variable 'item' from source: include params 11044 1726853257.31891: variable 'omit' from source: magic vars 11044 1726853257.31923: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11044 1726853257.31949: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11044 1726853257.31973: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11044 
1726853257.31986: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11044 1726853257.31997: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11044 1726853257.32021: variable 'inventory_hostname' from source: host vars for 'managed_node1' 11044 1726853257.32024: variable 'ansible_host' from source: host vars for 'managed_node1' 11044 1726853257.32027: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11044 1726853257.32100: Set connection var ansible_timeout to 10 11044 1726853257.32107: Set connection var ansible_shell_executable to /bin/sh 11044 1726853257.32110: Set connection var ansible_shell_type to sh 11044 1726853257.32115: Set connection var ansible_module_compression to ZIP_DEFLATED 11044 1726853257.32120: Set connection var ansible_connection to ssh 11044 1726853257.32125: Set connection var ansible_pipelining to False 11044 1726853257.32143: variable 'ansible_shell_executable' from source: unknown 11044 1726853257.32148: variable 'ansible_connection' from source: unknown 11044 1726853257.32151: variable 'ansible_module_compression' from source: unknown 11044 1726853257.32153: variable 'ansible_shell_type' from source: unknown 11044 1726853257.32155: variable 'ansible_shell_executable' from source: unknown 11044 1726853257.32159: variable 'ansible_host' from source: host vars for 'managed_node1' 11044 1726853257.32161: variable 'ansible_pipelining' from source: unknown 11044 1726853257.32165: variable 'ansible_timeout' from source: unknown 11044 1726853257.32168: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11044 1726853257.32260: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11044 1726853257.32269: variable 'omit' from source: magic vars 11044 1726853257.32275: starting attempt loop 11044 1726853257.32288: running the handler 11044 1726853257.32352: variable 'lsr_net_profile_ansible_managed' from source: set_fact 11044 1726853257.32355: Evaluated conditional (lsr_net_profile_ansible_managed): True 11044 1726853257.32361: handler run complete 11044 1726853257.32373: attempt loop complete, returning result 11044 1726853257.32376: _execute() done 11044 1726853257.32379: dumping result to json 11044 1726853257.32381: done dumping result, returning 11044 1726853257.32389: done running TaskExecutor() for managed_node1/TASK: Assert that the ansible managed comment is present in 'bond0.1' [02083763-bbaf-c5a6-f857-000000000270] 11044 1726853257.32391: sending task result for task 02083763-bbaf-c5a6-f857-000000000270 11044 1726853257.32470: done sending task result for task 02083763-bbaf-c5a6-f857-000000000270 11044 1726853257.32474: WORKER PROCESS EXITING ok: [managed_node1] => { "changed": false } MSG: All assertions passed 11044 1726853257.32542: no more pending results, returning what we have 11044 1726853257.32547: results queue empty 11044 1726853257.32549: checking for any_errors_fatal 11044 1726853257.32555: done checking for any_errors_fatal 11044 1726853257.32556: checking for max_fail_percentage 11044 1726853257.32557: done checking for max_fail_percentage 11044 1726853257.32558: checking to see if all hosts have failed and the running result is not ok 11044 1726853257.32559: done checking to see if all hosts have failed 11044 1726853257.32560: getting the remaining hosts for this loop 11044 1726853257.32561: done getting the remaining hosts for this loop 11044 1726853257.32564: getting the next task for host managed_node1 11044 1726853257.32569: done getting 
next task for host managed_node1 11044 1726853257.32573: ^ task is: TASK: Assert that the fingerprint comment is present in {{ profile }} 11044 1726853257.32575: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11044 1726853257.32579: getting variables 11044 1726853257.32580: in VariableManager get_vars() 11044 1726853257.32612: Calling all_inventory to load vars for managed_node1 11044 1726853257.32615: Calling groups_inventory to load vars for managed_node1 11044 1726853257.32617: Calling all_plugins_inventory to load vars for managed_node1 11044 1726853257.32626: Calling all_plugins_play to load vars for managed_node1 11044 1726853257.32629: Calling groups_plugins_inventory to load vars for managed_node1 11044 1726853257.32631: Calling groups_plugins_play to load vars for managed_node1 11044 1726853257.33483: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11044 1726853257.34335: done with get_vars() 11044 1726853257.34351: done getting variables 11044 1726853257.34392: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 11044 1726853257.34468: variable 'profile' from source: include params 11044 1726853257.34473: variable 'item' from 
source: include params 11044 1726853257.34510: variable 'item' from source: include params TASK [Assert that the fingerprint comment is present in bond0.1] *************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:15 Friday 20 September 2024 13:27:37 -0400 (0:00:00.037) 0:00:21.721 ****** 11044 1726853257.34539: entering _queue_task() for managed_node1/assert 11044 1726853257.34774: worker is 1 (out of 1 available) 11044 1726853257.34788: exiting _queue_task() for managed_node1/assert 11044 1726853257.34800: done queuing things up, now waiting for results queue to drain 11044 1726853257.34802: waiting for pending results... 11044 1726853257.34973: running TaskExecutor() for managed_node1/TASK: Assert that the fingerprint comment is present in bond0.1 11044 1726853257.35047: in run() - task 02083763-bbaf-c5a6-f857-000000000271 11044 1726853257.35058: variable 'ansible_search_path' from source: unknown 11044 1726853257.35061: variable 'ansible_search_path' from source: unknown 11044 1726853257.35092: calling self._execute() 11044 1726853257.35167: variable 'ansible_host' from source: host vars for 'managed_node1' 11044 1726853257.35172: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11044 1726853257.35182: variable 'omit' from source: magic vars 11044 1726853257.35455: variable 'ansible_distribution_major_version' from source: facts 11044 1726853257.35465: Evaluated conditional (ansible_distribution_major_version != '6'): True 11044 1726853257.35479: variable 'omit' from source: magic vars 11044 1726853257.35506: variable 'omit' from source: magic vars 11044 1726853257.35581: variable 'profile' from source: include params 11044 1726853257.35584: variable 'item' from source: include params 11044 1726853257.35627: variable 'item' from source: include params 11044 1726853257.35641: variable 'omit' from source: magic vars 11044 1726853257.35676: 
trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11044 1726853257.35707: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11044 1726853257.35723: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11044 1726853257.35736: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11044 1726853257.35749: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11044 1726853257.35770: variable 'inventory_hostname' from source: host vars for 'managed_node1' 11044 1726853257.35775: variable 'ansible_host' from source: host vars for 'managed_node1' 11044 1726853257.35777: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11044 1726853257.35849: Set connection var ansible_timeout to 10 11044 1726853257.35854: Set connection var ansible_shell_executable to /bin/sh 11044 1726853257.35857: Set connection var ansible_shell_type to sh 11044 1726853257.35862: Set connection var ansible_module_compression to ZIP_DEFLATED 11044 1726853257.35867: Set connection var ansible_connection to ssh 11044 1726853257.35874: Set connection var ansible_pipelining to False 11044 1726853257.35891: variable 'ansible_shell_executable' from source: unknown 11044 1726853257.35896: variable 'ansible_connection' from source: unknown 11044 1726853257.35899: variable 'ansible_module_compression' from source: unknown 11044 1726853257.35901: variable 'ansible_shell_type' from source: unknown 11044 1726853257.35904: variable 'ansible_shell_executable' from source: unknown 11044 1726853257.35906: variable 'ansible_host' from source: host vars for 'managed_node1' 11044 1726853257.35908: variable 'ansible_pipelining' from source: unknown 11044 1726853257.35911: variable 'ansible_timeout' 
from source: unknown 11044 1726853257.35914: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11044 1726853257.36008: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11044 1726853257.36016: variable 'omit' from source: magic vars 11044 1726853257.36024: starting attempt loop 11044 1726853257.36027: running the handler 11044 1726853257.36101: variable 'lsr_net_profile_fingerprint' from source: set_fact 11044 1726853257.36105: Evaluated conditional (lsr_net_profile_fingerprint): True 11044 1726853257.36110: handler run complete 11044 1726853257.36121: attempt loop complete, returning result 11044 1726853257.36124: _execute() done 11044 1726853257.36126: dumping result to json 11044 1726853257.36129: done dumping result, returning 11044 1726853257.36135: done running TaskExecutor() for managed_node1/TASK: Assert that the fingerprint comment is present in bond0.1 [02083763-bbaf-c5a6-f857-000000000271] 11044 1726853257.36139: sending task result for task 02083763-bbaf-c5a6-f857-000000000271 11044 1726853257.36219: done sending task result for task 02083763-bbaf-c5a6-f857-000000000271 11044 1726853257.36222: WORKER PROCESS EXITING ok: [managed_node1] => { "changed": false } MSG: All assertions passed 11044 1726853257.36299: no more pending results, returning what we have 11044 1726853257.36302: results queue empty 11044 1726853257.36303: checking for any_errors_fatal 11044 1726853257.36307: done checking for any_errors_fatal 11044 1726853257.36308: checking for max_fail_percentage 11044 1726853257.36309: done checking for max_fail_percentage 11044 1726853257.36310: checking to see if all hosts have failed and the running result is not ok 11044 1726853257.36311: done checking to see if all 
hosts have failed 11044 1726853257.36312: getting the remaining hosts for this loop 11044 1726853257.36313: done getting the remaining hosts for this loop 11044 1726853257.36316: getting the next task for host managed_node1 11044 1726853257.36322: done getting next task for host managed_node1 11044 1726853257.36325: ^ task is: TASK: ** TEST check polling interval 11044 1726853257.36326: ^ state is: HOST STATE: block=2, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11044 1726853257.36330: getting variables 11044 1726853257.36331: in VariableManager get_vars() 11044 1726853257.36373: Calling all_inventory to load vars for managed_node1 11044 1726853257.36376: Calling groups_inventory to load vars for managed_node1 11044 1726853257.36378: Calling all_plugins_inventory to load vars for managed_node1 11044 1726853257.36391: Calling all_plugins_play to load vars for managed_node1 11044 1726853257.36393: Calling groups_plugins_inventory to load vars for managed_node1 11044 1726853257.36396: Calling groups_plugins_play to load vars for managed_node1 11044 1726853257.37152: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11044 1726853257.38119: done with get_vars() 11044 1726853257.38134: done getting variables 11044 1726853257.38179: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [** TEST check polling interval] ****************************************** task path: 
/tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond_deprecated.yml:75 Friday 20 September 2024 13:27:37 -0400 (0:00:00.036) 0:00:21.757 ****** 11044 1726853257.38201: entering _queue_task() for managed_node1/command 11044 1726853257.38449: worker is 1 (out of 1 available) 11044 1726853257.38463: exiting _queue_task() for managed_node1/command 11044 1726853257.38476: done queuing things up, now waiting for results queue to drain 11044 1726853257.38477: waiting for pending results... 11044 1726853257.38652: running TaskExecutor() for managed_node1/TASK: ** TEST check polling interval 11044 1726853257.38711: in run() - task 02083763-bbaf-c5a6-f857-000000000071 11044 1726853257.38724: variable 'ansible_search_path' from source: unknown 11044 1726853257.38754: calling self._execute() 11044 1726853257.38842: variable 'ansible_host' from source: host vars for 'managed_node1' 11044 1726853257.38846: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11044 1726853257.38857: variable 'omit' from source: magic vars 11044 1726853257.39134: variable 'ansible_distribution_major_version' from source: facts 11044 1726853257.39176: Evaluated conditional (ansible_distribution_major_version != '6'): True 11044 1726853257.39180: variable 'omit' from source: magic vars 11044 1726853257.39183: variable 'omit' from source: magic vars 11044 1726853257.39237: variable 'controller_device' from source: play vars 11044 1726853257.39256: variable 'omit' from source: magic vars 11044 1726853257.39291: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11044 1726853257.39318: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11044 1726853257.39335: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11044 1726853257.39348: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11044 1726853257.39366: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11044 1726853257.39391: variable 'inventory_hostname' from source: host vars for 'managed_node1' 11044 1726853257.39394: variable 'ansible_host' from source: host vars for 'managed_node1' 11044 1726853257.39397: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11044 1726853257.39469: Set connection var ansible_timeout to 10 11044 1726853257.39475: Set connection var ansible_shell_executable to /bin/sh 11044 1726853257.39477: Set connection var ansible_shell_type to sh 11044 1726853257.39482: Set connection var ansible_module_compression to ZIP_DEFLATED 11044 1726853257.39487: Set connection var ansible_connection to ssh 11044 1726853257.39493: Set connection var ansible_pipelining to False 11044 1726853257.39511: variable 'ansible_shell_executable' from source: unknown 11044 1726853257.39514: variable 'ansible_connection' from source: unknown 11044 1726853257.39517: variable 'ansible_module_compression' from source: unknown 11044 1726853257.39519: variable 'ansible_shell_type' from source: unknown 11044 1726853257.39521: variable 'ansible_shell_executable' from source: unknown 11044 1726853257.39523: variable 'ansible_host' from source: host vars for 'managed_node1' 11044 1726853257.39527: variable 'ansible_pipelining' from source: unknown 11044 1726853257.39529: variable 'ansible_timeout' from source: unknown 11044 1726853257.39534: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11044 1726853257.39631: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11044 1726853257.39641: variable 'omit' from source: magic vars 11044 1726853257.39648: starting attempt loop 11044 1726853257.39651: running the handler 11044 1726853257.39662: _low_level_execute_command(): starting 11044 1726853257.39669: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11044 1726853257.40193: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11044 1726853257.40196: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11044 1726853257.40199: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11044 1726853257.40201: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11044 1726853257.40251: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 11044 1726853257.40257: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11044 1726853257.40272: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11044 1726853257.40316: stderr chunk (state=3): 
>>>debug1: mux_client_request_session: master session id: 2 <<< 11044 1726853257.41990: stdout chunk (state=3): >>>/root <<< 11044 1726853257.42094: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11044 1726853257.42122: stderr chunk (state=3): >>><<< 11044 1726853257.42126: stdout chunk (state=3): >>><<< 11044 1726853257.42151: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11044 1726853257.42164: _low_level_execute_command(): starting 11044 1726853257.42169: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853257.42151-12159-209176492154510 `" && echo ansible-tmp-1726853257.42151-12159-209176492154510="` echo 
/root/.ansible/tmp/ansible-tmp-1726853257.42151-12159-209176492154510 `" ) && sleep 0' 11044 1726853257.42616: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11044 1726853257.42620: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found <<< 11044 1726853257.42623: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration <<< 11044 1726853257.42632: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found <<< 11044 1726853257.42635: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11044 1726853257.42679: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 11044 1726853257.42683: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11044 1726853257.42686: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11044 1726853257.42725: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11044 1726853257.44617: stdout chunk (state=3): >>>ansible-tmp-1726853257.42151-12159-209176492154510=/root/.ansible/tmp/ansible-tmp-1726853257.42151-12159-209176492154510 <<< 11044 1726853257.44721: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11044 
1726853257.44751: stderr chunk (state=3): >>><<< 11044 1726853257.44754: stdout chunk (state=3): >>><<< 11044 1726853257.44770: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853257.42151-12159-209176492154510=/root/.ansible/tmp/ansible-tmp-1726853257.42151-12159-209176492154510 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11044 1726853257.44800: variable 'ansible_module_compression' from source: unknown 11044 1726853257.44840: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-1104467doc9gy/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 11044 1726853257.44877: variable 'ansible_facts' from source: unknown 11044 1726853257.44926: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853257.42151-12159-209176492154510/AnsiballZ_command.py 11044 1726853257.45030: Sending initial data 11044 1726853257.45034: 
Sent initial data (154 bytes) 11044 1726853257.45476: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11044 1726853257.45479: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11044 1726853257.45482: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 11044 1726853257.45484: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found <<< 11044 1726853257.45486: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11044 1726853257.45535: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 11044 1726853257.45546: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11044 1726853257.45580: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11044 1726853257.47118: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension 
"fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 <<< 11044 1726853257.47125: stderr chunk (state=3): >>>debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 11044 1726853257.47156: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 11044 1726853257.47206: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-1104467doc9gy/tmpnj59ff_3 /root/.ansible/tmp/ansible-tmp-1726853257.42151-12159-209176492154510/AnsiballZ_command.py <<< 11044 1726853257.47209: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853257.42151-12159-209176492154510/AnsiballZ_command.py" <<< 11044 1726853257.47241: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-1104467doc9gy/tmpnj59ff_3" to remote "/root/.ansible/tmp/ansible-tmp-1726853257.42151-12159-209176492154510/AnsiballZ_command.py" <<< 11044 1726853257.47252: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853257.42151-12159-209176492154510/AnsiballZ_command.py" <<< 11044 1726853257.47761: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11044 1726853257.47803: stderr chunk (state=3): >>><<< 11044 1726853257.47806: stdout chunk (state=3): >>><<< 11044 1726853257.47822: done transferring module to remote 11044 1726853257.47830: _low_level_execute_command(): starting 11044 1726853257.47835: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853257.42151-12159-209176492154510/ 
/root/.ansible/tmp/ansible-tmp-1726853257.42151-12159-209176492154510/AnsiballZ_command.py && sleep 0' 11044 1726853257.48264: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11044 1726853257.48267: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11044 1726853257.48270: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration <<< 11044 1726853257.48275: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11044 1726853257.48322: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 11044 1726853257.48326: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11044 1726853257.48329: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11044 1726853257.48377: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11044 1726853257.50102: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11044 1726853257.50124: stderr chunk (state=3): >>><<< 11044 1726853257.50127: stdout chunk (state=3): >>><<< 11044 1726853257.50139: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 
debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11044 1726853257.50142: _low_level_execute_command(): starting 11044 1726853257.50150: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853257.42151-12159-209176492154510/AnsiballZ_command.py && sleep 0' 11044 1726853257.50573: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11044 1726853257.50577: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11044 1726853257.50588: stderr chunk (state=3): >>>debug1: configuration requests final 
Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11044 1726853257.50650: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 11044 1726853257.50654: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11044 1726853257.50657: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11044 1726853257.50698: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11044 1726853258.66243: stdout chunk (state=3): >>> {"changed": true, "stdout": "MII Polling Interval (ms): 110", "stderr": "", "rc": 0, "cmd": ["grep", "Polling Interval", "/proc/net/bonding/deprecated-bond"], "start": "2024-09-20 13:27:37.657163", "end": "2024-09-20 13:27:38.661630", "delta": "0:00:01.004467", "msg": "", "invocation": {"module_args": {"_raw_params": "grep 'Polling Interval' /proc/net/bonding/deprecated-bond", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 11044 1726853258.67924: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.45.153 closed. 
<<< 11044 1726853258.67954: stderr chunk (state=3): >>><<< 11044 1726853258.67958: stdout chunk (state=3): >>><<< 11044 1726853258.67978: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "MII Polling Interval (ms): 110", "stderr": "", "rc": 0, "cmd": ["grep", "Polling Interval", "/proc/net/bonding/deprecated-bond"], "start": "2024-09-20 13:27:37.657163", "end": "2024-09-20 13:27:38.661630", "delta": "0:00:01.004467", "msg": "", "invocation": {"module_args": {"_raw_params": "grep 'Polling Interval' /proc/net/bonding/deprecated-bond", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.45.153 closed. 
11044 1726853258.68006: done with _execute_module (ansible.legacy.command, {'_raw_params': "grep 'Polling Interval' /proc/net/bonding/deprecated-bond", '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853257.42151-12159-209176492154510/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11044 1726853258.68012: _low_level_execute_command(): starting 11044 1726853258.68017: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853257.42151-12159-209176492154510/ > /dev/null 2>&1 && sleep 0' 11044 1726853258.68485: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11044 1726853258.68488: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11044 1726853258.68491: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11044 1726853258.68498: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11044 1726853258.68548: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 11044 1726853258.68552: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11044 1726853258.68556: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11044 1726853258.68596: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11044 1726853258.70460: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11044 1726853258.70491: stderr chunk (state=3): >>><<< 11044 1726853258.70494: stdout chunk (state=3): >>><<< 11044 1726853258.70508: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: 
mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11044 1726853258.70514: handler run complete 11044 1726853258.70533: Evaluated conditional (False): False 11044 1726853258.70651: variable 'result' from source: unknown 11044 1726853258.70663: Evaluated conditional ('110' in result.stdout): True 11044 1726853258.70674: attempt loop complete, returning result 11044 1726853258.70677: _execute() done 11044 1726853258.70680: dumping result to json 11044 1726853258.70686: done dumping result, returning 11044 1726853258.70693: done running TaskExecutor() for managed_node1/TASK: ** TEST check polling interval [02083763-bbaf-c5a6-f857-000000000071] 11044 1726853258.70695: sending task result for task 02083763-bbaf-c5a6-f857-000000000071 11044 1726853258.70794: done sending task result for task 02083763-bbaf-c5a6-f857-000000000071 11044 1726853258.70796: WORKER PROCESS EXITING ok: [managed_node1] => { "attempts": 1, "changed": false, "cmd": [ "grep", "Polling Interval", "/proc/net/bonding/deprecated-bond" ], "delta": "0:00:01.004467", "end": "2024-09-20 13:27:38.661630", "rc": 0, "start": "2024-09-20 13:27:37.657163" } STDOUT: MII Polling Interval (ms): 110 11044 1726853258.70867: no more pending results, returning what we have 11044 1726853258.70870: results queue empty 11044 1726853258.70873: checking for any_errors_fatal 11044 1726853258.70880: done checking for any_errors_fatal 11044 1726853258.70881: checking for max_fail_percentage 11044 1726853258.70883: done checking for max_fail_percentage 11044 1726853258.70884: checking to see if all hosts have failed and the running result is not ok 11044 1726853258.70885: done checking to see if all hosts have failed 11044 1726853258.70885: getting the remaining hosts for this loop 11044 1726853258.70887: done getting the remaining hosts for this loop 11044 1726853258.70890: getting the next task for host managed_node1 11044 
1726853258.70895: done getting next task for host managed_node1 11044 1726853258.70898: ^ task is: TASK: ** TEST check IPv4 11044 1726853258.70900: ^ state is: HOST STATE: block=2, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11044 1726853258.70903: getting variables 11044 1726853258.70906: in VariableManager get_vars() 11044 1726853258.70949: Calling all_inventory to load vars for managed_node1 11044 1726853258.70952: Calling groups_inventory to load vars for managed_node1 11044 1726853258.70954: Calling all_plugins_inventory to load vars for managed_node1 11044 1726853258.70964: Calling all_plugins_play to load vars for managed_node1 11044 1726853258.70966: Calling groups_plugins_inventory to load vars for managed_node1 11044 1726853258.70969: Calling groups_plugins_play to load vars for managed_node1 11044 1726853258.71776: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11044 1726853258.72641: done with get_vars() 11044 1726853258.72662: done getting variables 11044 1726853258.72708: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [** TEST check IPv4] ****************************************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond_deprecated.yml:80 Friday 20 September 2024 13:27:38 -0400 (0:00:01.345) 0:00:23.103 ****** 11044 1726853258.72730: entering _queue_task() for managed_node1/command 11044 1726853258.72996: worker is 1 (out of 1 
available) 11044 1726853258.73009: exiting _queue_task() for managed_node1/command 11044 1726853258.73022: done queuing things up, now waiting for results queue to drain 11044 1726853258.73024: waiting for pending results... 11044 1726853258.73204: running TaskExecutor() for managed_node1/TASK: ** TEST check IPv4 11044 1726853258.73273: in run() - task 02083763-bbaf-c5a6-f857-000000000072 11044 1726853258.73286: variable 'ansible_search_path' from source: unknown 11044 1726853258.73315: calling self._execute() 11044 1726853258.73394: variable 'ansible_host' from source: host vars for 'managed_node1' 11044 1726853258.73397: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11044 1726853258.73406: variable 'omit' from source: magic vars 11044 1726853258.73691: variable 'ansible_distribution_major_version' from source: facts 11044 1726853258.73695: Evaluated conditional (ansible_distribution_major_version != '6'): True 11044 1726853258.73701: variable 'omit' from source: magic vars 11044 1726853258.73717: variable 'omit' from source: magic vars 11044 1726853258.73787: variable 'controller_device' from source: play vars 11044 1726853258.73805: variable 'omit' from source: magic vars 11044 1726853258.73840: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11044 1726853258.73868: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11044 1726853258.73887: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11044 1726853258.73902: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11044 1726853258.73912: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11044 1726853258.73937: variable 'inventory_hostname' from source: host vars for 
'managed_node1' 11044 1726853258.73940: variable 'ansible_host' from source: host vars for 'managed_node1' 11044 1726853258.73943: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11044 1726853258.74019: Set connection var ansible_timeout to 10 11044 1726853258.74025: Set connection var ansible_shell_executable to /bin/sh 11044 1726853258.74027: Set connection var ansible_shell_type to sh 11044 1726853258.74030: Set connection var ansible_module_compression to ZIP_DEFLATED 11044 1726853258.74036: Set connection var ansible_connection to ssh 11044 1726853258.74041: Set connection var ansible_pipelining to False 11044 1726853258.74061: variable 'ansible_shell_executable' from source: unknown 11044 1726853258.74064: variable 'ansible_connection' from source: unknown 11044 1726853258.74067: variable 'ansible_module_compression' from source: unknown 11044 1726853258.74069: variable 'ansible_shell_type' from source: unknown 11044 1726853258.74074: variable 'ansible_shell_executable' from source: unknown 11044 1726853258.74076: variable 'ansible_host' from source: host vars for 'managed_node1' 11044 1726853258.74078: variable 'ansible_pipelining' from source: unknown 11044 1726853258.74080: variable 'ansible_timeout' from source: unknown 11044 1726853258.74085: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11044 1726853258.74185: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11044 1726853258.74194: variable 'omit' from source: magic vars 11044 1726853258.74199: starting attempt loop 11044 1726853258.74201: running the handler 11044 1726853258.74215: _low_level_execute_command(): starting 11044 1726853258.74221: _low_level_execute_command(): executing: /bin/sh 
-c 'echo ~ && sleep 0' 11044 1726853258.74749: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11044 1726853258.74753: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11044 1726853258.74757: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 <<< 11044 1726853258.74760: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11044 1726853258.74812: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 11044 1726853258.74815: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11044 1726853258.74818: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11044 1726853258.74867: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11044 1726853258.76570: stdout chunk (state=3): >>>/root <<< 11044 1726853258.76663: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11044 1726853258.76700: stderr chunk (state=3): >>><<< 11044 1726853258.76703: stdout chunk (state=3): >>><<< 11044 1726853258.76726: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: 
Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11044 1726853258.76738: _low_level_execute_command(): starting 11044 1726853258.76746: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853258.767253-12190-265259348264262 `" && echo ansible-tmp-1726853258.767253-12190-265259348264262="` echo /root/.ansible/tmp/ansible-tmp-1726853258.767253-12190-265259348264262 `" ) && sleep 0' 11044 1726853258.77203: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11044 1726853258.77209: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found 
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11044 1726853258.77219: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address <<< 11044 1726853258.77221: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11044 1726853258.77224: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11044 1726853258.77268: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 11044 1726853258.77275: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11044 1726853258.77281: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11044 1726853258.77322: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11044 1726853258.79246: stdout chunk (state=3): >>>ansible-tmp-1726853258.767253-12190-265259348264262=/root/.ansible/tmp/ansible-tmp-1726853258.767253-12190-265259348264262 <<< 11044 1726853258.79358: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11044 1726853258.79391: stderr chunk (state=3): >>><<< 11044 1726853258.79394: stdout chunk (state=3): >>><<< 11044 1726853258.79411: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853258.767253-12190-265259348264262=/root/.ansible/tmp/ansible-tmp-1726853258.767253-12190-265259348264262 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11044 1726853258.79439: variable 'ansible_module_compression' from source: unknown 11044 1726853258.79483: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-1104467doc9gy/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 11044 1726853258.79520: variable 'ansible_facts' from source: unknown 11044 1726853258.79569: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853258.767253-12190-265259348264262/AnsiballZ_command.py 11044 1726853258.79676: Sending initial data 11044 1726853258.79680: Sent initial data (155 bytes) 11044 1726853258.80133: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11044 1726853258.80136: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found <<< 11044 
1726853258.80139: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 11044 1726853258.80143: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11044 1726853258.80148: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11044 1726853258.80198: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 11044 1726853258.80201: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11044 1726853258.80204: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11044 1726853258.80246: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11044 1726853258.81846: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 
11044 1726853258.81885: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 11044 1726853258.81925: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-1104467doc9gy/tmp15ll6taz /root/.ansible/tmp/ansible-tmp-1726853258.767253-12190-265259348264262/AnsiballZ_command.py <<< 11044 1726853258.81927: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853258.767253-12190-265259348264262/AnsiballZ_command.py" <<< 11044 1726853258.81962: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-1104467doc9gy/tmp15ll6taz" to remote "/root/.ansible/tmp/ansible-tmp-1726853258.767253-12190-265259348264262/AnsiballZ_command.py" <<< 11044 1726853258.81969: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853258.767253-12190-265259348264262/AnsiballZ_command.py" <<< 11044 1726853258.82493: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11044 1726853258.82540: stderr chunk (state=3): >>><<< 11044 1726853258.82543: stdout chunk (state=3): >>><<< 11044 1726853258.82572: done transferring module to remote 11044 1726853258.82582: _low_level_execute_command(): starting 11044 1726853258.82587: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853258.767253-12190-265259348264262/ /root/.ansible/tmp/ansible-tmp-1726853258.767253-12190-265259348264262/AnsiballZ_command.py && sleep 0' 11044 1726853258.83042: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11044 1726853258.83045: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found <<< 
11044 1726853258.83047: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11044 1726853258.83049: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11044 1726853258.83056: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11044 1726853258.83110: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 11044 1726853258.83116: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11044 1726853258.83119: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11044 1726853258.83157: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11044 1726853258.85017: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11044 1726853258.85042: stderr chunk (state=3): >>><<< 11044 1726853258.85048: stdout chunk (state=3): >>><<< 11044 1726853258.85060: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address 
debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11044 1726853258.85063: _low_level_execute_command(): starting 11044 1726853258.85068: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853258.767253-12190-265259348264262/AnsiballZ_command.py && sleep 0' 11044 1726853258.85519: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11044 1726853258.85523: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11044 1726853258.85534: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 
11044 1726853258.85595: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 11044 1726853258.85598: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11044 1726853258.85605: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11044 1726853258.85651: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11044 1726853259.01385: stdout chunk (state=3): >>> {"changed": true, "stdout": "13: deprecated-bond: mtu 1500 qdisc noqueue state UP group default qlen 1000\n inet 192.0.2.220/24 brd 192.0.2.255 scope global dynamic noprefixroute deprecated-bond\n valid_lft 236sec preferred_lft 236sec", "stderr": "", "rc": 0, "cmd": ["ip", "-4", "a", "s", "deprecated-bond"], "start": "2024-09-20 13:27:39.009134", "end": "2024-09-20 13:27:39.013015", "delta": "0:00:00.003881", "msg": "", "invocation": {"module_args": {"_raw_params": "ip -4 a s deprecated-bond", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 11044 1726853259.02995: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.45.153 closed. 
<<< 11044 1726853259.03023: stderr chunk (state=3): >>><<< 11044 1726853259.03027: stdout chunk (state=3): >>><<< 11044 1726853259.03045: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "13: deprecated-bond: mtu 1500 qdisc noqueue state UP group default qlen 1000\n inet 192.0.2.220/24 brd 192.0.2.255 scope global dynamic noprefixroute deprecated-bond\n valid_lft 236sec preferred_lft 236sec", "stderr": "", "rc": 0, "cmd": ["ip", "-4", "a", "s", "deprecated-bond"], "start": "2024-09-20 13:27:39.009134", "end": "2024-09-20 13:27:39.013015", "delta": "0:00:00.003881", "msg": "", "invocation": {"module_args": {"_raw_params": "ip -4 a s deprecated-bond", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: 
mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.45.153 closed. 11044 1726853259.03076: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip -4 a s deprecated-bond', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853258.767253-12190-265259348264262/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11044 1726853259.03083: _low_level_execute_command(): starting 11044 1726853259.03090: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853258.767253-12190-265259348264262/ > /dev/null 2>&1 && sleep 0' 11044 1726853259.03547: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11044 1726853259.03551: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11044 1726853259.03553: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 11044 1726853259.03557: stderr chunk 
(state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11044 1726853259.03610: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 11044 1726853259.03613: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11044 1726853259.03615: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11044 1726853259.03660: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11044 1726853259.05493: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11044 1726853259.05521: stderr chunk (state=3): >>><<< 11044 1726853259.05524: stdout chunk (state=3): >>><<< 11044 1726853259.05537: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing 
master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11044 1726853259.05543: handler run complete 11044 1726853259.05562: Evaluated conditional (False): False 11044 1726853259.05676: variable 'result' from source: set_fact 11044 1726853259.05690: Evaluated conditional ('192.0.2' in result.stdout): True 11044 1726853259.05699: attempt loop complete, returning result 11044 1726853259.05702: _execute() done 11044 1726853259.05705: dumping result to json 11044 1726853259.05709: done dumping result, returning 11044 1726853259.05721: done running TaskExecutor() for managed_node1/TASK: ** TEST check IPv4 [02083763-bbaf-c5a6-f857-000000000072] 11044 1726853259.05723: sending task result for task 02083763-bbaf-c5a6-f857-000000000072 11044 1726853259.05840: done sending task result for task 02083763-bbaf-c5a6-f857-000000000072 11044 1726853259.05843: WORKER PROCESS EXITING ok: [managed_node1] => { "attempts": 1, "changed": false, "cmd": [ "ip", "-4", "a", "s", "deprecated-bond" ], "delta": "0:00:00.003881", "end": "2024-09-20 13:27:39.013015", "rc": 0, "start": "2024-09-20 13:27:39.009134" } STDOUT: 13: deprecated-bond: mtu 1500 qdisc noqueue state UP group default qlen 1000 inet 192.0.2.220/24 brd 192.0.2.255 scope global dynamic noprefixroute deprecated-bond valid_lft 236sec preferred_lft 236sec 11044 1726853259.05922: no more pending results, returning what we have 11044 1726853259.05925: results queue empty 11044 1726853259.05926: checking for any_errors_fatal 11044 1726853259.05934: done checking for any_errors_fatal 11044 1726853259.05935: checking for max_fail_percentage 11044 1726853259.05936: done checking for max_fail_percentage 11044 1726853259.05938: checking to see if all hosts have failed and the running result is not ok 11044 1726853259.05938: done checking to see if all hosts have failed 
11044 1726853259.05939: getting the remaining hosts for this loop 11044 1726853259.05940: done getting the remaining hosts for this loop 11044 1726853259.05943: getting the next task for host managed_node1 11044 1726853259.05951: done getting next task for host managed_node1 11044 1726853259.05955: ^ task is: TASK: ** TEST check IPv6 11044 1726853259.05957: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11044 1726853259.05960: getting variables 11044 1726853259.05961: in VariableManager get_vars() 11044 1726853259.05998: Calling all_inventory to load vars for managed_node1 11044 1726853259.06001: Calling groups_inventory to load vars for managed_node1 11044 1726853259.06003: Calling all_plugins_inventory to load vars for managed_node1 11044 1726853259.06012: Calling all_plugins_play to load vars for managed_node1 11044 1726853259.06015: Calling groups_plugins_inventory to load vars for managed_node1 11044 1726853259.06017: Calling groups_plugins_play to load vars for managed_node1 11044 1726853259.06927: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11044 1726853259.07769: done with get_vars() 11044 1726853259.07786: done getting variables 11044 1726853259.07831: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [** TEST check IPv6] ****************************************************** task path: 
/tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond_deprecated.yml:87 Friday 20 September 2024 13:27:39 -0400 (0:00:00.351) 0:00:23.454 ****** 11044 1726853259.07853: entering _queue_task() for managed_node1/command 11044 1726853259.08096: worker is 1 (out of 1 available) 11044 1726853259.08110: exiting _queue_task() for managed_node1/command 11044 1726853259.08122: done queuing things up, now waiting for results queue to drain 11044 1726853259.08124: waiting for pending results... 11044 1726853259.08300: running TaskExecutor() for managed_node1/TASK: ** TEST check IPv6 11044 1726853259.08361: in run() - task 02083763-bbaf-c5a6-f857-000000000073 11044 1726853259.08374: variable 'ansible_search_path' from source: unknown 11044 1726853259.08403: calling self._execute() 11044 1726853259.08487: variable 'ansible_host' from source: host vars for 'managed_node1' 11044 1726853259.08491: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11044 1726853259.08500: variable 'omit' from source: magic vars 11044 1726853259.08868: variable 'ansible_distribution_major_version' from source: facts 11044 1726853259.08880: Evaluated conditional (ansible_distribution_major_version != '6'): True 11044 1726853259.08884: variable 'omit' from source: magic vars 11044 1726853259.08887: variable 'omit' from source: magic vars 11044 1726853259.09025: variable 'controller_device' from source: play vars 11044 1726853259.09054: variable 'omit' from source: magic vars 11044 1726853259.09099: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11044 1726853259.09138: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11044 1726853259.09157: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11044 1726853259.09170: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11044 1726853259.09184: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11044 1726853259.09207: variable 'inventory_hostname' from source: host vars for 'managed_node1' 11044 1726853259.09211: variable 'ansible_host' from source: host vars for 'managed_node1' 11044 1726853259.09213: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11044 1726853259.09284: Set connection var ansible_timeout to 10 11044 1726853259.09293: Set connection var ansible_shell_executable to /bin/sh 11044 1726853259.09296: Set connection var ansible_shell_type to sh 11044 1726853259.09298: Set connection var ansible_module_compression to ZIP_DEFLATED 11044 1726853259.09303: Set connection var ansible_connection to ssh 11044 1726853259.09308: Set connection var ansible_pipelining to False 11044 1726853259.09325: variable 'ansible_shell_executable' from source: unknown 11044 1726853259.09328: variable 'ansible_connection' from source: unknown 11044 1726853259.09332: variable 'ansible_module_compression' from source: unknown 11044 1726853259.09334: variable 'ansible_shell_type' from source: unknown 11044 1726853259.09336: variable 'ansible_shell_executable' from source: unknown 11044 1726853259.09339: variable 'ansible_host' from source: host vars for 'managed_node1' 11044 1726853259.09341: variable 'ansible_pipelining' from source: unknown 11044 1726853259.09343: variable 'ansible_timeout' from source: unknown 11044 1726853259.09348: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11044 1726853259.09447: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11044 1726853259.09455: variable 'omit' from source: magic vars 11044 1726853259.09459: starting attempt loop 11044 1726853259.09462: running the handler 11044 1726853259.09479: _low_level_execute_command(): starting 11044 1726853259.09485: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11044 1726853259.10006: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11044 1726853259.10010: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11044 1726853259.10013: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11044 1726853259.10015: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11044 1726853259.10067: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 11044 1726853259.10073: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11044 1726853259.10076: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11044 1726853259.10120: stderr chunk (state=3): 
>>>debug1: mux_client_request_session: master session id: 2 <<< 11044 1726853259.11825: stdout chunk (state=3): >>>/root <<< 11044 1726853259.11977: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11044 1726853259.11981: stdout chunk (state=3): >>><<< 11044 1726853259.11983: stderr chunk (state=3): >>><<< 11044 1726853259.12005: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11044 1726853259.12107: _low_level_execute_command(): starting 11044 1726853259.12111: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853259.1201768-12205-163935209676169 `" && echo ansible-tmp-1726853259.1201768-12205-163935209676169="` echo 
/root/.ansible/tmp/ansible-tmp-1726853259.1201768-12205-163935209676169 `" ) && sleep 0' 11044 1726853259.12650: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11044 1726853259.12676: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11044 1726853259.12730: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11044 1726853259.12750: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11044 1726853259.12805: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 11044 1726853259.12839: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11044 1726853259.12876: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11044 1726853259.12920: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11044 1726853259.14834: stdout chunk (state=3): >>>ansible-tmp-1726853259.1201768-12205-163935209676169=/root/.ansible/tmp/ansible-tmp-1726853259.1201768-12205-163935209676169 <<< 11044 1726853259.14978: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11044 1726853259.15000: stdout chunk (state=3): >>><<< 11044 
1726853259.15003: stderr chunk (state=3): >>><<< 11044 1726853259.15176: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853259.1201768-12205-163935209676169=/root/.ansible/tmp/ansible-tmp-1726853259.1201768-12205-163935209676169 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11044 1726853259.15180: variable 'ansible_module_compression' from source: unknown 11044 1726853259.15183: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-1104467doc9gy/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 11044 1726853259.15185: variable 'ansible_facts' from source: unknown 11044 1726853259.15256: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853259.1201768-12205-163935209676169/AnsiballZ_command.py 11044 1726853259.15433: Sending initial data 11044 1726853259.15442: Sent initial data (156 bytes) 11044 
1726853259.16763: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11044 1726853259.16799: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 11044 1726853259.16811: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11044 1726853259.16819: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11044 1726853259.16992: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11044 1726853259.18627: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension 
"expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 <<< 11044 1726853259.18631: stderr chunk (state=3): >>>debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 11044 1726853259.18660: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 11044 1726853259.18715: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-1104467doc9gy/tmp1lkqp4fb /root/.ansible/tmp/ansible-tmp-1726853259.1201768-12205-163935209676169/AnsiballZ_command.py <<< 11044 1726853259.18734: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853259.1201768-12205-163935209676169/AnsiballZ_command.py" <<< 11044 1726853259.18756: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-1104467doc9gy/tmp1lkqp4fb" to remote "/root/.ansible/tmp/ansible-tmp-1726853259.1201768-12205-163935209676169/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853259.1201768-12205-163935209676169/AnsiballZ_command.py" <<< 11044 1726853259.20129: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11044 1726853259.20218: stderr chunk (state=3): >>><<< 11044 1726853259.20382: stdout chunk (state=3): >>><<< 11044 1726853259.20387: done transferring module to remote 11044 1726853259.20390: _low_level_execute_command(): starting 11044 1726853259.20393: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853259.1201768-12205-163935209676169/ /root/.ansible/tmp/ansible-tmp-1726853259.1201768-12205-163935209676169/AnsiballZ_command.py && sleep 0' 11044 1726853259.21668: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11044 1726853259.21678: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config 
<<< 11044 1726853259.21681: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found <<< 11044 1726853259.21683: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11044 1726853259.21686: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found <<< 11044 1726853259.21689: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11044 1726853259.21784: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11044 1726853259.21820: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11044 1726853259.24191: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11044 1726853259.24195: stdout chunk (state=3): >>><<< 11044 1726853259.24197: stderr chunk (state=3): >>><<< 11044 1726853259.24200: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11044 1726853259.24202: _low_level_execute_command(): starting 11044 1726853259.24204: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853259.1201768-12205-163935209676169/AnsiballZ_command.py && sleep 0' 11044 1726853259.25078: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11044 1726853259.25127: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11044 1726853259.25141: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11044 1726853259.25161: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11044 1726853259.25187: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 <<< 11044 1726853259.25202: stderr chunk (state=3): >>>debug2: match not found <<< 11044 1726853259.25212: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11044 1726853259.25299: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK <<< 11044 1726853259.25316: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11044 1726853259.25391: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11044 1726853259.41384: stdout chunk (state=3): >>> {"changed": true, "stdout": "13: deprecated-bond: mtu 1500 qdisc noqueue state UP group default qlen 1000\n inet6 2001:db8::54/128 scope global dynamic noprefixroute \n valid_lft 236sec preferred_lft 236sec\n inet6 2001:db8::282e:89ff:fee9:8d5/64 scope global dynamic noprefixroute \n valid_lft 1794sec preferred_lft 1794sec\n inet6 fe80::282e:89ff:fee9:8d5/64 scope link noprefixroute \n valid_lft forever preferred_lft forever", "stderr": "", "rc": 0, "cmd": ["ip", "-6", "a", "s", "deprecated-bond"], "start": "2024-09-20 13:27:39.408990", "end": "2024-09-20 13:27:39.412958", "delta": "0:00:00.003968", "msg": "", "invocation": {"module_args": {"_raw_params": "ip -6 a s deprecated-bond", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 11044 1726853259.43015: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.45.153 closed. 
<<< 11044 1726853259.43019: stdout chunk (state=3): >>><<< 11044 1726853259.43022: stderr chunk (state=3): >>><<< 11044 1726853259.43167: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "13: deprecated-bond: mtu 1500 qdisc noqueue state UP group default qlen 1000\n inet6 2001:db8::54/128 scope global dynamic noprefixroute \n valid_lft 236sec preferred_lft 236sec\n inet6 2001:db8::282e:89ff:fee9:8d5/64 scope global dynamic noprefixroute \n valid_lft 1794sec preferred_lft 1794sec\n inet6 fe80::282e:89ff:fee9:8d5/64 scope link noprefixroute \n valid_lft forever preferred_lft forever", "stderr": "", "rc": 0, "cmd": ["ip", "-6", "a", "s", "deprecated-bond"], "start": "2024-09-20 13:27:39.408990", "end": "2024-09-20 13:27:39.412958", "delta": "0:00:00.003968", "msg": "", "invocation": {"module_args": {"_raw_params": "ip -6 a s deprecated-bond", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.45.153 closed. 11044 1726853259.43175: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip -6 a s deprecated-bond', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853259.1201768-12205-163935209676169/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11044 1726853259.43177: _low_level_execute_command(): starting 11044 1726853259.43180: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853259.1201768-12205-163935209676169/ > /dev/null 2>&1 && sleep 0' 11044 1726853259.43744: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11044 1726853259.43759: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11044 1726853259.43782: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11044 1726853259.43799: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11044 1726853259.43850: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not 
found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11044 1726853259.43913: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 11044 1726853259.43931: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11044 1726853259.43959: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11044 1726853259.44025: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11044 1726853259.45983: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11044 1726853259.45986: stdout chunk (state=3): >>><<< 11044 1726853259.45988: stderr chunk (state=3): >>><<< 11044 1726853259.45991: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11044 1726853259.45993: handler run complete 11044 1726853259.46011: Evaluated conditional (False): False 11044 1726853259.46198: variable 'result' from source: set_fact 11044 1726853259.46222: Evaluated conditional ('2001' in result.stdout): True 11044 1726853259.46240: attempt loop complete, returning result 11044 1726853259.46252: _execute() done 11044 1726853259.46259: dumping result to json 11044 1726853259.46269: done dumping result, returning 11044 1726853259.46293: done running TaskExecutor() for managed_node1/TASK: ** TEST check IPv6 [02083763-bbaf-c5a6-f857-000000000073] 11044 1726853259.46304: sending task result for task 02083763-bbaf-c5a6-f857-000000000073 ok: [managed_node1] => { "attempts": 1, "changed": false, "cmd": [ "ip", "-6", "a", "s", "deprecated-bond" ], "delta": "0:00:00.003968", "end": "2024-09-20 13:27:39.412958", "rc": 0, "start": "2024-09-20 13:27:39.408990" } STDOUT: 13: deprecated-bond: mtu 1500 qdisc noqueue state UP group default qlen 1000 inet6 2001:db8::54/128 scope global dynamic noprefixroute valid_lft 236sec preferred_lft 236sec inet6 2001:db8::282e:89ff:fee9:8d5/64 scope global dynamic noprefixroute valid_lft 1794sec preferred_lft 1794sec inet6 fe80::282e:89ff:fee9:8d5/64 scope link noprefixroute valid_lft forever preferred_lft forever 11044 1726853259.46557: no more pending results, returning what we have 11044 1726853259.46561: results queue empty 11044 1726853259.46562: checking for any_errors_fatal 11044 1726853259.46687: done checking 
for any_errors_fatal 11044 1726853259.46689: checking for max_fail_percentage 11044 1726853259.46692: done checking for max_fail_percentage 11044 1726853259.46693: checking to see if all hosts have failed and the running result is not ok 11044 1726853259.46694: done checking to see if all hosts have failed 11044 1726853259.46694: getting the remaining hosts for this loop 11044 1726853259.46696: done getting the remaining hosts for this loop 11044 1726853259.46700: getting the next task for host managed_node1 11044 1726853259.46712: done getting next task for host managed_node1 11044 1726853259.46717: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 11044 1726853259.46721: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 11044 1726853259.46740: getting variables 11044 1726853259.46741: in VariableManager get_vars() 11044 1726853259.46895: Calling all_inventory to load vars for managed_node1 11044 1726853259.46905: Calling groups_inventory to load vars for managed_node1 11044 1726853259.46909: Calling all_plugins_inventory to load vars for managed_node1 11044 1726853259.46915: done sending task result for task 02083763-bbaf-c5a6-f857-000000000073 11044 1726853259.46918: WORKER PROCESS EXITING 11044 1726853259.46928: Calling all_plugins_play to load vars for managed_node1 11044 1726853259.46932: Calling groups_plugins_inventory to load vars for managed_node1 11044 1726853259.46935: Calling groups_plugins_play to load vars for managed_node1 11044 1726853259.48429: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11044 1726853259.50112: done with get_vars() 11044 1726853259.50141: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4 Friday 20 September 2024 13:27:39 -0400 (0:00:00.424) 0:00:23.878 ****** 11044 1726853259.50267: entering _queue_task() for managed_node1/include_tasks 11044 1726853259.50657: worker is 1 (out of 1 available) 11044 1726853259.50874: exiting _queue_task() for managed_node1/include_tasks 11044 1726853259.50885: done queuing things up, now waiting for results queue to drain 11044 1726853259.50887: waiting for pending results... 
11044 1726853259.50989: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 11044 1726853259.51173: in run() - task 02083763-bbaf-c5a6-f857-00000000007d 11044 1726853259.51200: variable 'ansible_search_path' from source: unknown 11044 1726853259.51209: variable 'ansible_search_path' from source: unknown 11044 1726853259.51259: calling self._execute() 11044 1726853259.51375: variable 'ansible_host' from source: host vars for 'managed_node1' 11044 1726853259.51388: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11044 1726853259.51406: variable 'omit' from source: magic vars 11044 1726853259.51855: variable 'ansible_distribution_major_version' from source: facts 11044 1726853259.51876: Evaluated conditional (ansible_distribution_major_version != '6'): True 11044 1726853259.51890: _execute() done 11044 1726853259.51898: dumping result to json 11044 1726853259.51906: done dumping result, returning 11044 1726853259.51917: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role [02083763-bbaf-c5a6-f857-00000000007d] 11044 1726853259.51925: sending task result for task 02083763-bbaf-c5a6-f857-00000000007d 11044 1726853259.52124: no more pending results, returning what we have 11044 1726853259.52129: in VariableManager get_vars() 11044 1726853259.52182: Calling all_inventory to load vars for managed_node1 11044 1726853259.52185: Calling groups_inventory to load vars for managed_node1 11044 1726853259.52187: Calling all_plugins_inventory to load vars for managed_node1 11044 1726853259.52200: Calling all_plugins_play to load vars for managed_node1 11044 1726853259.52204: Calling groups_plugins_inventory to load vars for managed_node1 11044 1726853259.52206: Calling groups_plugins_play to load vars for managed_node1 11044 1726853259.52930: done sending task result for task 02083763-bbaf-c5a6-f857-00000000007d 11044 
1726853259.52933: WORKER PROCESS EXITING 11044 1726853259.54232: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11044 1726853259.56498: done with get_vars() 11044 1726853259.56532: variable 'ansible_search_path' from source: unknown 11044 1726853259.56533: variable 'ansible_search_path' from source: unknown 11044 1726853259.56584: we have included files to process 11044 1726853259.56585: generating all_blocks data 11044 1726853259.56587: done generating all_blocks data 11044 1726853259.56592: processing included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 11044 1726853259.56594: loading included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 11044 1726853259.56596: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 11044 1726853259.57233: done processing included file 11044 1726853259.57235: iterating over new_blocks loaded from include file 11044 1726853259.57236: in VariableManager get_vars() 11044 1726853259.57267: done with get_vars() 11044 1726853259.57269: filtering new block on tags 11044 1726853259.57306: done filtering new block on tags 11044 1726853259.57309: in VariableManager get_vars() 11044 1726853259.57334: done with get_vars() 11044 1726853259.57335: filtering new block on tags 11044 1726853259.57387: done filtering new block on tags 11044 1726853259.57390: in VariableManager get_vars() 11044 1726853259.57415: done with get_vars() 11044 1726853259.57417: filtering new block on tags 11044 1726853259.57460: done filtering new block on tags 11044 1726853259.57462: done iterating over new_blocks loaded from include file included: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml for managed_node1 11044 1726853259.57469: extending task lists for all hosts 
with included blocks 11044 1726853259.59709: done extending task lists 11044 1726853259.59711: done processing included files 11044 1726853259.59712: results queue empty 11044 1726853259.59712: checking for any_errors_fatal 11044 1726853259.59716: done checking for any_errors_fatal 11044 1726853259.59717: checking for max_fail_percentage 11044 1726853259.59718: done checking for max_fail_percentage 11044 1726853259.59719: checking to see if all hosts have failed and the running result is not ok 11044 1726853259.59720: done checking to see if all hosts have failed 11044 1726853259.59721: getting the remaining hosts for this loop 11044 1726853259.59722: done getting the remaining hosts for this loop 11044 1726853259.59724: getting the next task for host managed_node1 11044 1726853259.59729: done getting next task for host managed_node1 11044 1726853259.59732: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 11044 1726853259.59736: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 11044 1726853259.59749: getting variables 11044 1726853259.59751: in VariableManager get_vars() 11044 1726853259.59774: Calling all_inventory to load vars for managed_node1 11044 1726853259.59777: Calling groups_inventory to load vars for managed_node1 11044 1726853259.59779: Calling all_plugins_inventory to load vars for managed_node1 11044 1726853259.59785: Calling all_plugins_play to load vars for managed_node1 11044 1726853259.59788: Calling groups_plugins_inventory to load vars for managed_node1 11044 1726853259.59790: Calling groups_plugins_play to load vars for managed_node1 11044 1726853259.62260: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11044 1726853259.66847: done with get_vars() 11044 1726853259.66874: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role are present] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:3 Friday 20 September 2024 13:27:39 -0400 (0:00:00.166) 0:00:24.045 ****** 11044 1726853259.66959: entering _queue_task() for managed_node1/setup 11044 1726853259.67584: worker is 1 (out of 1 available) 11044 1726853259.67599: exiting _queue_task() for managed_node1/setup 11044 1726853259.67613: done queuing things up, now waiting for results queue to drain 11044 1726853259.67614: waiting for pending results... 
11044 1726853259.67936: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 11044 1726853259.68134: in run() - task 02083763-bbaf-c5a6-f857-000000000494 11044 1726853259.68156: variable 'ansible_search_path' from source: unknown 11044 1726853259.68163: variable 'ansible_search_path' from source: unknown 11044 1726853259.68207: calling self._execute() 11044 1726853259.68310: variable 'ansible_host' from source: host vars for 'managed_node1' 11044 1726853259.68326: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11044 1726853259.68337: variable 'omit' from source: magic vars 11044 1726853259.68727: variable 'ansible_distribution_major_version' from source: facts 11044 1726853259.68752: Evaluated conditional (ansible_distribution_major_version != '6'): True 11044 1726853259.68993: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 11044 1726853259.73275: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 11044 1726853259.73352: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 11044 1726853259.73581: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 11044 1726853259.73585: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 11044 1726853259.73676: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 11044 1726853259.73757: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11044 1726853259.73796: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' 
from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11044 1726853259.73837: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11044 1726853259.73889: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11044 1726853259.73917: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11044 1726853259.73984: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11044 1726853259.74010: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11044 1726853259.74050: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11044 1726853259.74096: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11044 1726853259.74115: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11044 1726853259.74298: variable '__network_required_facts' from source: role 
'' defaults 11044 1726853259.74313: variable 'ansible_facts' from source: unknown 11044 1726853259.75136: Evaluated conditional (__network_required_facts | difference(ansible_facts.keys() | list) | length > 0): False 11044 1726853259.75147: when evaluation is False, skipping this task 11044 1726853259.75155: _execute() done 11044 1726853259.75161: dumping result to json 11044 1726853259.75168: done dumping result, returning 11044 1726853259.75183: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present [02083763-bbaf-c5a6-f857-000000000494] 11044 1726853259.75192: sending task result for task 02083763-bbaf-c5a6-f857-000000000494 skipping: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 11044 1726853259.75379: no more pending results, returning what we have 11044 1726853259.75383: results queue empty 11044 1726853259.75384: checking for any_errors_fatal 11044 1726853259.75385: done checking for any_errors_fatal 11044 1726853259.75386: checking for max_fail_percentage 11044 1726853259.75388: done checking for max_fail_percentage 11044 1726853259.75389: checking to see if all hosts have failed and the running result is not ok 11044 1726853259.75390: done checking to see if all hosts have failed 11044 1726853259.75391: getting the remaining hosts for this loop 11044 1726853259.75392: done getting the remaining hosts for this loop 11044 1726853259.75396: getting the next task for host managed_node1 11044 1726853259.75406: done getting next task for host managed_node1 11044 1726853259.75410: ^ task is: TASK: fedora.linux_system_roles.network : Check if system is ostree 11044 1726853259.75415: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 11044 1726853259.75434: getting variables 11044 1726853259.75436: in VariableManager get_vars() 11044 1726853259.75483: Calling all_inventory to load vars for managed_node1 11044 1726853259.75486: Calling groups_inventory to load vars for managed_node1 11044 1726853259.75489: Calling all_plugins_inventory to load vars for managed_node1 11044 1726853259.75499: Calling all_plugins_play to load vars for managed_node1 11044 1726853259.75502: Calling groups_plugins_inventory to load vars for managed_node1 11044 1726853259.75505: Calling groups_plugins_play to load vars for managed_node1 11044 1726853259.76186: done sending task result for task 02083763-bbaf-c5a6-f857-000000000494 11044 1726853259.76190: WORKER PROCESS EXITING 11044 1726853259.77243: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11044 1726853259.79673: done with get_vars() 11044 1726853259.79712: done getting variables TASK [fedora.linux_system_roles.network : Check if system is ostree] *********** task path: 
/tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:12 Friday 20 September 2024 13:27:39 -0400 (0:00:00.128) 0:00:24.174 ****** 11044 1726853259.79838: entering _queue_task() for managed_node1/stat 11044 1726853259.80222: worker is 1 (out of 1 available) 11044 1726853259.80236: exiting _queue_task() for managed_node1/stat 11044 1726853259.80253: done queuing things up, now waiting for results queue to drain 11044 1726853259.80254: waiting for pending results... 11044 1726853259.80578: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if system is ostree 11044 1726853259.80937: in run() - task 02083763-bbaf-c5a6-f857-000000000496 11044 1726853259.80941: variable 'ansible_search_path' from source: unknown 11044 1726853259.80947: variable 'ansible_search_path' from source: unknown 11044 1726853259.80950: calling self._execute() 11044 1726853259.81061: variable 'ansible_host' from source: host vars for 'managed_node1' 11044 1726853259.81079: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11044 1726853259.81096: variable 'omit' from source: magic vars 11044 1726853259.81833: variable 'ansible_distribution_major_version' from source: facts 11044 1726853259.81956: Evaluated conditional (ansible_distribution_major_version != '6'): True 11044 1726853259.82343: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 11044 1726853259.82706: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 11044 1726853259.82754: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 11044 1726853259.82795: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 11044 1726853259.82827: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 
11044 1726853259.82927: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 11044 1726853259.82956: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 11044 1726853259.82985: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 11044 1726853259.83018: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 11044 1726853259.83115: variable '__network_is_ostree' from source: set_fact 11044 1726853259.83126: Evaluated conditional (not __network_is_ostree is defined): False 11044 1726853259.83133: when evaluation is False, skipping this task 11044 1726853259.83139: _execute() done 11044 1726853259.83148: dumping result to json 11044 1726853259.83156: done dumping result, returning 11044 1726853259.83166: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if system is ostree [02083763-bbaf-c5a6-f857-000000000496] 11044 1726853259.83175: sending task result for task 02083763-bbaf-c5a6-f857-000000000496 skipping: [managed_node1] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 11044 1726853259.83321: no more pending results, returning what we have 11044 1726853259.83325: results queue empty 11044 1726853259.83327: checking for any_errors_fatal 11044 1726853259.83333: done checking for any_errors_fatal 11044 1726853259.83334: checking for max_fail_percentage 11044 1726853259.83336: done checking for 
max_fail_percentage 11044 1726853259.83337: checking to see if all hosts have failed and the running result is not ok 11044 1726853259.83338: done checking to see if all hosts have failed 11044 1726853259.83339: getting the remaining hosts for this loop 11044 1726853259.83340: done getting the remaining hosts for this loop 11044 1726853259.83346: getting the next task for host managed_node1 11044 1726853259.83354: done getting next task for host managed_node1 11044 1726853259.83358: ^ task is: TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 11044 1726853259.83362: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 11044 1726853259.83384: getting variables 11044 1726853259.83386: in VariableManager get_vars() 11044 1726853259.83431: Calling all_inventory to load vars for managed_node1 11044 1726853259.83434: Calling groups_inventory to load vars for managed_node1 11044 1726853259.83437: Calling all_plugins_inventory to load vars for managed_node1 11044 1726853259.83451: Calling all_plugins_play to load vars for managed_node1 11044 1726853259.83454: Calling groups_plugins_inventory to load vars for managed_node1 11044 1726853259.83457: Calling groups_plugins_play to load vars for managed_node1 11044 1726853259.84084: done sending task result for task 02083763-bbaf-c5a6-f857-000000000496 11044 1726853259.84089: WORKER PROCESS EXITING 11044 1726853259.85503: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11044 1726853259.87439: done with get_vars() 11044 1726853259.87475: done getting variables 11044 1726853259.87568: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Set flag to indicate system is ostree] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:17 Friday 20 September 2024 13:27:39 -0400 (0:00:00.077) 0:00:24.251 ****** 11044 1726853259.87608: entering _queue_task() for managed_node1/set_fact 11044 1726853259.87991: worker is 1 (out of 1 available) 11044 1726853259.88180: exiting _queue_task() for managed_node1/set_fact 11044 1726853259.88192: done queuing things up, now waiting for results queue to drain 11044 1726853259.88193: waiting for pending results... 
11044 1726853259.88333: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 11044 1726853259.88524: in run() - task 02083763-bbaf-c5a6-f857-000000000497 11044 1726853259.88643: variable 'ansible_search_path' from source: unknown 11044 1726853259.88649: variable 'ansible_search_path' from source: unknown 11044 1726853259.88652: calling self._execute() 11044 1726853259.88713: variable 'ansible_host' from source: host vars for 'managed_node1' 11044 1726853259.88727: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11044 1726853259.88742: variable 'omit' from source: magic vars 11044 1726853259.89308: variable 'ansible_distribution_major_version' from source: facts 11044 1726853259.89369: Evaluated conditional (ansible_distribution_major_version != '6'): True 11044 1726853259.89795: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 11044 1726853259.90386: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 11044 1726853259.90502: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 11044 1726853259.90552: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 11044 1726853259.90624: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 11044 1726853259.90748: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 11044 1726853259.90790: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 11044 1726853259.90848: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 11044 1726853259.90903: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 11044 1726853259.90998: variable '__network_is_ostree' from source: set_fact 11044 1726853259.91041: Evaluated conditional (not __network_is_ostree is defined): False 11044 1726853259.91047: when evaluation is False, skipping this task 11044 1726853259.91050: _execute() done 11044 1726853259.91052: dumping result to json 11044 1726853259.91054: done dumping result, returning 11044 1726853259.91057: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree [02083763-bbaf-c5a6-f857-000000000497] 11044 1726853259.91059: sending task result for task 02083763-bbaf-c5a6-f857-000000000497 skipping: [managed_node1] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 11044 1726853259.91309: no more pending results, returning what we have 11044 1726853259.91313: results queue empty 11044 1726853259.91314: checking for any_errors_fatal 11044 1726853259.91321: done checking for any_errors_fatal 11044 1726853259.91322: checking for max_fail_percentage 11044 1726853259.91324: done checking for max_fail_percentage 11044 1726853259.91325: checking to see if all hosts have failed and the running result is not ok 11044 1726853259.91326: done checking to see if all hosts have failed 11044 1726853259.91327: getting the remaining hosts for this loop 11044 1726853259.91328: done getting the remaining hosts for this loop 11044 1726853259.91332: getting the next task for host managed_node1 11044 1726853259.91342: done getting next task for host managed_node1 11044 
1726853259.91349: ^ task is: TASK: fedora.linux_system_roles.network : Check which services are running 11044 1726853259.91355: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 11044 1726853259.91584: getting variables 11044 1726853259.91586: in VariableManager get_vars() 11044 1726853259.91622: Calling all_inventory to load vars for managed_node1 11044 1726853259.91624: Calling groups_inventory to load vars for managed_node1 11044 1726853259.91627: Calling all_plugins_inventory to load vars for managed_node1 11044 1726853259.91635: Calling all_plugins_play to load vars for managed_node1 11044 1726853259.91637: Calling groups_plugins_inventory to load vars for managed_node1 11044 1726853259.91639: Calling groups_plugins_play to load vars for managed_node1 11044 1726853259.92185: done sending task result for task 02083763-bbaf-c5a6-f857-000000000497 11044 1726853259.92189: WORKER PROCESS EXITING 11044 1726853259.93132: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11044 1726853259.94786: done with get_vars() 11044 1726853259.94818: done getting variables TASK [fedora.linux_system_roles.network : Check which services are running] **** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 Friday 20 September 2024 13:27:39 -0400 (0:00:00.073) 0:00:24.325 ****** 11044 1726853259.94932: entering _queue_task() for managed_node1/service_facts 11044 1726853259.95532: worker is 1 (out of 1 available) 11044 1726853259.95549: exiting _queue_task() for managed_node1/service_facts 11044 1726853259.95563: done queuing things up, now waiting for results queue to drain 11044 1726853259.95564: waiting for pending results... 
11044 1726853259.96279: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check which services are running 11044 1726853259.96507: in run() - task 02083763-bbaf-c5a6-f857-000000000499 11044 1726853259.96527: variable 'ansible_search_path' from source: unknown 11044 1726853259.96534: variable 'ansible_search_path' from source: unknown 11044 1726853259.96597: calling self._execute() 11044 1726853259.96708: variable 'ansible_host' from source: host vars for 'managed_node1' 11044 1726853259.96719: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11044 1726853259.96731: variable 'omit' from source: magic vars 11044 1726853259.97120: variable 'ansible_distribution_major_version' from source: facts 11044 1726853259.97137: Evaluated conditional (ansible_distribution_major_version != '6'): True 11044 1726853259.97152: variable 'omit' from source: magic vars 11044 1726853259.97243: variable 'omit' from source: magic vars 11044 1726853259.97286: variable 'omit' from source: magic vars 11044 1726853259.97337: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11044 1726853259.97383: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11044 1726853259.97409: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11044 1726853259.97437: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11044 1726853259.97458: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11044 1726853259.97493: variable 'inventory_hostname' from source: host vars for 'managed_node1' 11044 1726853259.97501: variable 'ansible_host' from source: host vars for 'managed_node1' 11044 1726853259.97508: variable 'ansible_ssh_extra_args' from source: host 
vars for 'managed_node1' 11044 1726853259.97615: Set connection var ansible_timeout to 10 11044 1726853259.97649: Set connection var ansible_shell_executable to /bin/sh 11044 1726853259.97651: Set connection var ansible_shell_type to sh 11044 1726853259.97654: Set connection var ansible_module_compression to ZIP_DEFLATED 11044 1726853259.97657: Set connection var ansible_connection to ssh 11044 1726853259.97756: Set connection var ansible_pipelining to False 11044 1726853259.97759: variable 'ansible_shell_executable' from source: unknown 11044 1726853259.97761: variable 'ansible_connection' from source: unknown 11044 1726853259.97764: variable 'ansible_module_compression' from source: unknown 11044 1726853259.97766: variable 'ansible_shell_type' from source: unknown 11044 1726853259.97768: variable 'ansible_shell_executable' from source: unknown 11044 1726853259.97770: variable 'ansible_host' from source: host vars for 'managed_node1' 11044 1726853259.97773: variable 'ansible_pipelining' from source: unknown 11044 1726853259.97776: variable 'ansible_timeout' from source: unknown 11044 1726853259.97778: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11044 1726853259.97939: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 11044 1726853259.97958: variable 'omit' from source: magic vars 11044 1726853259.97975: starting attempt loop 11044 1726853259.97982: running the handler 11044 1726853259.97999: _low_level_execute_command(): starting 11044 1726853259.98010: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11044 1726853259.98860: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 11044 1726853259.98890: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11044 1726853259.98903: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11044 1726853259.98986: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11044 1726853260.00977: stdout chunk (state=3): >>>/root <<< 11044 1726853260.00981: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11044 1726853260.00983: stdout chunk (state=3): >>><<< 11044 1726853260.00988: stderr chunk (state=3): >>><<< 11044 1726853260.00992: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 
10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11044 1726853260.00995: _low_level_execute_command(): starting 11044 1726853260.00997: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853260.008898-12245-152855283717947 `" && echo ansible-tmp-1726853260.008898-12245-152855283717947="` echo /root/.ansible/tmp/ansible-tmp-1726853260.008898-12245-152855283717947 `" ) && sleep 0' 11044 1726853260.02092: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found 
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11044 1726853260.02351: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11044 1726853260.02383: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11044 1726853260.04279: stdout chunk (state=3): >>>ansible-tmp-1726853260.008898-12245-152855283717947=/root/.ansible/tmp/ansible-tmp-1726853260.008898-12245-152855283717947 <<< 11044 1726853260.04394: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11044 1726853260.04437: stderr chunk (state=3): >>><<< 11044 1726853260.04451: stdout chunk (state=3): >>><<< 11044 1726853260.04777: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853260.008898-12245-152855283717947=/root/.ansible/tmp/ansible-tmp-1726853260.008898-12245-152855283717947 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 
3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11044 1726853260.04781: variable 'ansible_module_compression' from source: unknown 11044 1726853260.04784: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-1104467doc9gy/ansiballz_cache/ansible.modules.service_facts-ZIP_DEFLATED 11044 1726853260.04786: variable 'ansible_facts' from source: unknown 11044 1726853260.05030: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853260.008898-12245-152855283717947/AnsiballZ_service_facts.py 11044 1726853260.05262: Sending initial data 11044 1726853260.05273: Sent initial data (161 bytes) 11044 1726853260.05770: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11044 1726853260.05796: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11044 1726853260.05813: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11044 1726853260.05909: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 11044 1726853260.05930: stderr chunk (state=3): 
>>>debug2: fd 3 setting O_NONBLOCK <<< 11044 1726853260.05948: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11044 1726853260.06016: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11044 1726853260.07561: stderr chunk (state=3): >>>debug2: Remote version: 3 <<< 11044 1726853260.07590: stderr chunk (state=3): >>>debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 11044 1726853260.07662: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 11044 1726853260.07713: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-1104467doc9gy/tmpr7kny1n8 /root/.ansible/tmp/ansible-tmp-1726853260.008898-12245-152855283717947/AnsiballZ_service_facts.py <<< 11044 1726853260.07735: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853260.008898-12245-152855283717947/AnsiballZ_service_facts.py" <<< 11044 1726853260.07766: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-1104467doc9gy/tmpr7kny1n8" to remote "/root/.ansible/tmp/ansible-tmp-1726853260.008898-12245-152855283717947/AnsiballZ_service_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853260.008898-12245-152855283717947/AnsiballZ_service_facts.py" <<< 11044 1726853260.08536: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11044 1726853260.08578: stderr chunk (state=3): >>><<< 11044 1726853260.08592: stdout chunk (state=3): >>><<< 11044 1726853260.08639: done transferring module to remote 11044 1726853260.08653: _low_level_execute_command(): starting 11044 1726853260.08661: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853260.008898-12245-152855283717947/ /root/.ansible/tmp/ansible-tmp-1726853260.008898-12245-152855283717947/AnsiballZ_service_facts.py && sleep 0' 11044 1726853260.09298: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11044 1726853260.09313: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11044 1726853260.09382: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11044 1726853260.09403: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11044 1726853260.09455: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 11044 1726853260.09478: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11044 1726853260.09525: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11044 1726853260.09563: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11044 1726853260.11377: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11044 1726853260.11381: stdout chunk (state=3): >>><<< 11044 1726853260.11383: stderr chunk (state=3): >>><<< 11044 1726853260.11385: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration 
data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11044 1726853260.11388: _low_level_execute_command(): starting 11044 1726853260.11390: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853260.008898-12245-152855283717947/AnsiballZ_service_facts.py && sleep 0' 11044 1726853260.11977: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11044 1726853260.11987: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11044 1726853260.11997: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11044 1726853260.12011: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11044 1726853260.12030: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 <<< 11044 1726853260.12080: stderr chunk (state=3): >>>debug2: match not found <<< 11044 1726853260.12083: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11044 1726853260.12086: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 11044 1726853260.12094: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.45.153 is address <<< 11044 1726853260.12097: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 11044 1726853260.12099: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11044 1726853260.12230: stderr chunk 
(state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11044 1726853260.12234: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 11044 1726853260.12245: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11044 1726853260.12249: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11044 1726853260.12286: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11044 1726853261.64560: stdout chunk (state=3): >>> {"ansible_facts": {"services": {"audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": 
"systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": 
{"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "st<<< 11044 1726853261.64587: stdout chunk (state=3): >>>opped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, 
"modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", 
"state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source":<<< 11044 1726853261.64596: stdout chunk (state=3): >>> "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": 
"inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-clear.service": {"name": "systemd-hibernate-clear.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": 
"systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.<<< 11044 1726853261.64599: stdout chunk (state=3): >>>service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", 
"state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-quotacheck-root.service": {"name": "systemd-quotacheck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": 
"stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-load-credentials.service": {"name": "systemd-udev-load-credentials.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "st<<< 11044 1726853261.64637: stdout chunk (state=3): >>>opped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", 
"status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "capsule@.service": {"name": "capsule@.service", "state": "unknown", "status": "static", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, 
"dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fips-crypto-policy-overlay.service": {"name": "fips-crypto-policy-overlay.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "<<< 11044 1726853261.64659: stdout chunk (state=3): >>>inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", 
"status": "disabled", "source": "systemd"}, "lvm-devices-import.service": {"name": "lvm-devices-import.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon-root.service": {"name": "quotaon-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon@.service": {"name": "quotaon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", 
"source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "st<<< 11044 1726853261.64675: stdout chunk (state=3): >>>atic", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": 
{"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-bootctl@.service": {"name": "systemd-bootctl@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-creds@.service": {"name": "systemd-creds@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald-sync@.service": {"name": "systemd-journald-sync@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", 
"source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock@.service": {"name": "systemd-pcrlock@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck@.service": {"name": "systemd-quotacheck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": 
"inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} <<< 11044 1726853261.66118: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.45.153 closed. 
<<< 11044 1726853261.66151: stderr chunk (state=3): >>><<< 11044 1726853261.66154: stdout chunk (state=3): >>><<< 11044 1726853261.66177: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"services": {"audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, 
"NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": 
"stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-clear.service": {"name": "systemd-hibernate-clear.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": 
{"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-quotacheck-root.service": {"name": "systemd-quotacheck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-load-credentials.service": {"name": "systemd-udev-load-credentials.service", "state": "stopped", 
"status": "disabled", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "capsule@.service": {"name": "capsule@.service", "state": "unknown", "status": "static", "source": "systemd"}, "chrony-wait.service": {"name": 
"chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": 
{"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fips-crypto-policy-overlay.service": {"name": "fips-crypto-policy-overlay.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "lvm-devices-import.service": {"name": "lvm-devices-import.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": 
{"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon-root.service": {"name": "quotaon-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon@.service": {"name": "quotaon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": 
"inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-bootctl@.service": {"name": "systemd-bootctl@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-creds@.service": {"name": "systemd-creds@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": 
"systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald-sync@.service": {"name": "systemd-journald-sync@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, 
"systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock@.service": {"name": "systemd-pcrlock@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck@.service": {"name": "systemd-quotacheck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": 
"user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.45.153 closed. 
11044 1726853261.66839: done with _execute_module (service_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'service_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853260.008898-12245-152855283717947/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11044 1726853261.66848: _low_level_execute_command(): starting 11044 1726853261.66851: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853260.008898-12245-152855283717947/ > /dev/null 2>&1 && sleep 0' 11044 1726853261.67313: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11044 1726853261.67319: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found <<< 11044 1726853261.67321: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11044 1726853261.67323: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11044 1726853261.67326: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf 
debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11044 1726853261.67383: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 11044 1726853261.67386: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11044 1726853261.67393: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11044 1726853261.67431: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11044 1726853261.69222: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11044 1726853261.69251: stderr chunk (state=3): >>><<< 11044 1726853261.69254: stdout chunk (state=3): >>><<< 11044 1726853261.69273: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: 
mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11044 1726853261.69279: handler run complete 11044 1726853261.69388: variable 'ansible_facts' from source: unknown 11044 1726853261.69482: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11044 1726853261.69754: variable 'ansible_facts' from source: unknown 11044 1726853261.69833: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11044 1726853261.69951: attempt loop complete, returning result 11044 1726853261.69954: _execute() done 11044 1726853261.69957: dumping result to json 11044 1726853261.69992: done dumping result, returning 11044 1726853261.70000: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check which services are running [02083763-bbaf-c5a6-f857-000000000499] 11044 1726853261.70003: sending task result for task 02083763-bbaf-c5a6-f857-000000000499 11044 1726853261.70720: done sending task result for task 02083763-bbaf-c5a6-f857-000000000499 11044 1726853261.70723: WORKER PROCESS EXITING ok: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 11044 1726853261.70776: no more pending results, returning what we have 11044 1726853261.70778: results queue empty 11044 1726853261.70779: checking for any_errors_fatal 11044 1726853261.70780: done checking for any_errors_fatal 11044 1726853261.70781: checking for max_fail_percentage 11044 1726853261.70782: done checking for max_fail_percentage 11044 1726853261.70783: checking to see if all hosts have failed and the running result is not ok 11044 1726853261.70783: done checking to see if all hosts have failed 11044 1726853261.70783: getting the remaining hosts for this loop 11044 1726853261.70784: done getting the remaining hosts for this loop 11044 1726853261.70786: getting 
the next task for host managed_node1 11044 1726853261.70790: done getting next task for host managed_node1 11044 1726853261.70792: ^ task is: TASK: fedora.linux_system_roles.network : Check which packages are installed 11044 1726853261.70796: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 11044 1726853261.70806: getting variables 11044 1726853261.70807: in VariableManager get_vars() 11044 1726853261.70828: Calling all_inventory to load vars for managed_node1 11044 1726853261.70829: Calling groups_inventory to load vars for managed_node1 11044 1726853261.70831: Calling all_plugins_inventory to load vars for managed_node1 11044 1726853261.70837: Calling all_plugins_play to load vars for managed_node1 11044 1726853261.70839: Calling groups_plugins_inventory to load vars for managed_node1 11044 1726853261.70841: Calling groups_plugins_play to load vars for managed_node1 11044 1726853261.71504: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11044 1726853261.72375: done with get_vars() 11044 1726853261.72390: done getting variables TASK [fedora.linux_system_roles.network : Check which packages are installed] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26 Friday 20 September 2024 13:27:41 -0400 (0:00:01.775) 0:00:26.100 ****** 11044 1726853261.72464: entering _queue_task() for managed_node1/package_facts 11044 1726853261.72704: worker is 1 (out of 1 available) 11044 1726853261.72715: exiting _queue_task() for managed_node1/package_facts 11044 1726853261.72729: done queuing things up, now waiting for results queue to drain 11044 1726853261.72730: waiting for pending results... 
11044 1726853261.72908: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check which packages are installed 11044 1726853261.73019: in run() - task 02083763-bbaf-c5a6-f857-00000000049a 11044 1726853261.73031: variable 'ansible_search_path' from source: unknown 11044 1726853261.73035: variable 'ansible_search_path' from source: unknown 11044 1726853261.73063: calling self._execute() 11044 1726853261.73138: variable 'ansible_host' from source: host vars for 'managed_node1' 11044 1726853261.73142: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11044 1726853261.73151: variable 'omit' from source: magic vars 11044 1726853261.73420: variable 'ansible_distribution_major_version' from source: facts 11044 1726853261.73431: Evaluated conditional (ansible_distribution_major_version != '6'): True 11044 1726853261.73440: variable 'omit' from source: magic vars 11044 1726853261.73495: variable 'omit' from source: magic vars 11044 1726853261.73519: variable 'omit' from source: magic vars 11044 1726853261.73556: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11044 1726853261.73586: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11044 1726853261.73602: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11044 1726853261.73616: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11044 1726853261.73626: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11044 1726853261.73654: variable 'inventory_hostname' from source: host vars for 'managed_node1' 11044 1726853261.73658: variable 'ansible_host' from source: host vars for 'managed_node1' 11044 1726853261.73660: variable 'ansible_ssh_extra_args' from source: host 
vars for 'managed_node1' 11044 1726853261.73729: Set connection var ansible_timeout to 10 11044 1726853261.73736: Set connection var ansible_shell_executable to /bin/sh 11044 1726853261.73738: Set connection var ansible_shell_type to sh 11044 1726853261.73743: Set connection var ansible_module_compression to ZIP_DEFLATED 11044 1726853261.73752: Set connection var ansible_connection to ssh 11044 1726853261.73755: Set connection var ansible_pipelining to False 11044 1726853261.73777: variable 'ansible_shell_executable' from source: unknown 11044 1726853261.73780: variable 'ansible_connection' from source: unknown 11044 1726853261.73783: variable 'ansible_module_compression' from source: unknown 11044 1726853261.73785: variable 'ansible_shell_type' from source: unknown 11044 1726853261.73787: variable 'ansible_shell_executable' from source: unknown 11044 1726853261.73789: variable 'ansible_host' from source: host vars for 'managed_node1' 11044 1726853261.73791: variable 'ansible_pipelining' from source: unknown 11044 1726853261.73796: variable 'ansible_timeout' from source: unknown 11044 1726853261.73799: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11044 1726853261.73942: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 11044 1726853261.73953: variable 'omit' from source: magic vars 11044 1726853261.73956: starting attempt loop 11044 1726853261.73958: running the handler 11044 1726853261.73974: _low_level_execute_command(): starting 11044 1726853261.73983: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11044 1726853261.74486: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
<<< 11044 1726853261.74491: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration <<< 11044 1726853261.74495: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11044 1726853261.74548: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 11044 1726853261.74552: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11044 1726853261.74554: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11044 1726853261.74601: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11044 1726853261.76235: stdout chunk (state=3): >>>/root <<< 11044 1726853261.76341: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11044 1726853261.76369: stderr chunk (state=3): >>><<< 11044 1726853261.76374: stdout chunk (state=3): >>><<< 11044 1726853261.76394: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found 
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11044 1726853261.76404: _low_level_execute_command(): starting 11044 1726853261.76411: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853261.7639287-12303-228565400709350 `" && echo ansible-tmp-1726853261.7639287-12303-228565400709350="` echo /root/.ansible/tmp/ansible-tmp-1726853261.7639287-12303-228565400709350 `" ) && sleep 0' 11044 1726853261.76860: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11044 1726853261.76863: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found <<< 11044 1726853261.76866: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11044 1726853261.76877: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 
10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11044 1726853261.76880: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11044 1726853261.76927: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 11044 1726853261.76934: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11044 1726853261.76936: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11044 1726853261.76977: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11044 1726853261.78847: stdout chunk (state=3): >>>ansible-tmp-1726853261.7639287-12303-228565400709350=/root/.ansible/tmp/ansible-tmp-1726853261.7639287-12303-228565400709350 <<< 11044 1726853261.78947: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11044 1726853261.78976: stderr chunk (state=3): >>><<< 11044 1726853261.78979: stdout chunk (state=3): >>><<< 11044 1726853261.78995: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853261.7639287-12303-228565400709350=/root/.ansible/tmp/ansible-tmp-1726853261.7639287-12303-228565400709350 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: 
hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11044 1726853261.79038: variable 'ansible_module_compression' from source: unknown 11044 1726853261.79083: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-1104467doc9gy/ansiballz_cache/ansible.modules.package_facts-ZIP_DEFLATED 11044 1726853261.79138: variable 'ansible_facts' from source: unknown 11044 1726853261.79259: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853261.7639287-12303-228565400709350/AnsiballZ_package_facts.py 11044 1726853261.79365: Sending initial data 11044 1726853261.79368: Sent initial data (162 bytes) 11044 1726853261.79829: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11044 1726853261.79832: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found <<< 11044 1726853261.79834: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data 
/root/.ssh/config <<< 11044 1726853261.79837: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found <<< 11044 1726853261.79839: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11044 1726853261.79892: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 11044 1726853261.79896: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11044 1726853261.79900: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11044 1726853261.79936: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11044 1726853261.81461: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 11044 1726853261.81466: stderr chunk (state=3): >>>debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 11044 1726853261.81493: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 11044 1726853261.81533: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-1104467doc9gy/tmp_ro9r3at /root/.ansible/tmp/ansible-tmp-1726853261.7639287-12303-228565400709350/AnsiballZ_package_facts.py <<< 11044 1726853261.81539: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853261.7639287-12303-228565400709350/AnsiballZ_package_facts.py" <<< 11044 1726853261.81581: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-1104467doc9gy/tmp_ro9r3at" to remote "/root/.ansible/tmp/ansible-tmp-1726853261.7639287-12303-228565400709350/AnsiballZ_package_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853261.7639287-12303-228565400709350/AnsiballZ_package_facts.py" <<< 11044 1726853261.82602: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11044 1726853261.82642: stderr chunk (state=3): >>><<< 11044 1726853261.82646: stdout chunk (state=3): >>><<< 11044 1726853261.82688: done transferring module to remote 11044 1726853261.82697: _low_level_execute_command(): starting 11044 1726853261.82701: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853261.7639287-12303-228565400709350/ /root/.ansible/tmp/ansible-tmp-1726853261.7639287-12303-228565400709350/AnsiballZ_package_facts.py && sleep 0' 11044 1726853261.83137: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11044 1726853261.83140: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found <<< 11044 1726853261.83143: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 11044 1726853261.83148: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11044 1726853261.83154: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11044 1726853261.83198: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 11044 1726853261.83201: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11044 1726853261.83245: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11044 1726853261.84965: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11044 1726853261.84992: stderr chunk (state=3): >>><<< 11044 1726853261.84996: stdout chunk (state=3): >>><<< 11044 1726853261.85009: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11044 1726853261.85012: _low_level_execute_command(): starting 11044 1726853261.85017: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853261.7639287-12303-228565400709350/AnsiballZ_package_facts.py && sleep 0' 11044 1726853261.85436: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11044 1726853261.85473: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11044 1726853261.85477: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found <<< 11044 1726853261.85479: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration <<< 11044 1726853261.85482: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 <<< 11044 1726853261.85484: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 11044 1726853261.85531: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 11044 1726853261.85540: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11044 1726853261.85542: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11044 1726853261.85581: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11044 1726853262.29380: stdout chunk (state=3): >>> {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-mono-vf-fonts": [{"name": "redhat-mono-vf-fonts", "version": 
"4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-text-vf-fonts": [{"name": "redhat-text-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": 
"noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.379", "release": "10.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": 
"rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "21.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.6", "release": 
"3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.5", "rele<<< 11044 1726853262.29490: stdout chunk (state=3): >>>ase": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.30", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.5", 
"release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.50", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.46.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "51.20230828cvs.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.2", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "9.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": 
"jansson", "version": "2.14", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.10.0", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.5.0", "release": "1.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "10.el10", "epoch": null, 
"arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.0", "release": "3.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.4", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": 
[{"name": "polkit-libs", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "101.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.11.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.7", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", 
"release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.3", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "6.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.6.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "42.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certm<<< 11044 1726853262.29567: stdout chunk (state=3): >>>ap": [{"name": "libsss_certmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.6.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.29", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.79.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsecret": [{"name": "libsecret", "version": "0.21.2", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pinentry": [{"name": "pinentry", "version": "1.3.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", 
"release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.7.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.7", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.10", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.8", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "35", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": 
[{"name": "less", "version": "661", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "memstrack": [{"name": "memstrack", "version": "0.2.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.25.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "22.el10.gating_test1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse3-libs": [{"name": "fuse3-libs", "version": "3.16.2", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "1.el10", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "28.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.62.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", 
"release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-gssapi": [{"name": "cyrus-sasl-gssapi", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "256", "release": "14.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": "1.11^20190603git9e74f2d", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "29.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240905", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "2.el10", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.22.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "makedumpfile": [{"name": "makedumpfile", "version": "1.7.5", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "76.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "256", "release": "14.el10", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.32", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.10.0~beta2", "release": 
"3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-krb5-common": [{"name": "sssd-krb5-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": 
"4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "15.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "14.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.29", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kdump-utils": [{"name": "kdump-utils", "version": "1.0.43", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.13.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.6.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", 
"version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.3", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.24", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "1.2304.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", 
"version": "8.2408.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "8.el10", "epoch": 18, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.2.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.20", "release": "4.el10", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwlwifi-dvm-firmware": [{"name": "iwlwifi-dvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwlwifi-mvm-firmware": [{"name": "iwlwifi-mvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "37.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.89", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "r<<< 11044 1726853262.29619: stdout chunk (state=3): >>>pm"}], "perl-FileHandle": [{"name": 
"perl-FileHandle", "version": "2.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.189", "release": "511.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.27", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.12", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "510.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", 
"release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.20", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "511.el10", "epoch": null, "arch": "noarch", 
"source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.18", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.29", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.55", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "510.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.38", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "510.el10", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "510.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", 
"release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.58", "release": "2.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.91", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.56", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.17", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.44", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": 
"perl-lib", "version": "0.65", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.39", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.2.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "24.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "4.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.10", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.24.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": 
"4.33", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": 
"0.31.1", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.19", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": 
[{"name": "python3-audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "18.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.32.3", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.6", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "17.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "20.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", 
"version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.11", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "11.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} <<< 11044 1726853262.31332: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.45.153 closed. 
<<< 11044 1726853262.31363: stderr chunk (state=3): >>><<< 11044 1726853262.31366: stdout chunk (state=3): >>><<< 11044 1726853262.31400: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-mono-vf-fonts": [{"name": "redhat-mono-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-text-vf-fonts": [{"name": "redhat-text-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.1", "release": "2.el10", "epoch": null, 
"arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": 
[{"name": "linux-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.379", "release": "10.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "10.0", "release": 
"0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "21.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.6", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": 
"2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.5", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.30", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.50", "release": "1.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.46.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "51.20230828cvs.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.2", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "9.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": 
"3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.10.0", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.5.0", "release": "1.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.0", "release": "3.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.4", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": 
[{"name": "coreutils", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "101.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.11.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.7", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", 
"release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.3", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.6.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "11.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "42.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.6.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.29", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.79.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsecret": [{"name": "libsecret", "version": "0.21.2", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pinentry": [{"name": "pinentry", "version": "1.3.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", 
"release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.7.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": 
"ethtool", "version": "6.7", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.10", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.8", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "35", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "661", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "memstrack": [{"name": "memstrack", "version": "0.2.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "c-ares": [{"name": "c-ares", "version": "1.25.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "22.el10.gating_test1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse3-libs": [{"name": "fuse3-libs", "version": "3.16.2", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "28.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": 
[{"name": "iptables-libs", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.62.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-gssapi": [{"name": "cyrus-sasl-gssapi", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", 
"version": "1.11^20190603git9e74f2d", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "29.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240905", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.22.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "makedumpfile": [{"name": "makedumpfile", "version": "1.7.5", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": 
[{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "76.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": 
"kernel-modules-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.32", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-krb5-common": [{"name": "sssd-krb5-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.5", "release": "2.el10", 
"epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "15.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "14.el10", "epoch": 1, "arch": 
"noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.29", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kdump-utils": [{"name": "kdump-utils", "version": "1.0.43", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.13.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.6.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.3", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.24", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "1.2304.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2408.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.48", "release": "5.el10", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": 
"qemu-guest-agent", "version": "9.0.0", "release": "8.el10", "epoch": 18, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.2.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.20", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwlwifi-dvm-firmware": [{"name": 
"iwlwifi-dvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwlwifi-mvm-firmware": [{"name": "iwlwifi-mvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "37.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.89", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.189", "release": "511.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.27", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.12", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "510.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.20", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IPC-Open3": [{"name": 
"perl-IPC-Open3", "version": "1.22", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.18", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.29", "release": "510.el10", "epoch": 0, "arch": 
"x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.55", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "510.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.38", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "510.el10", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "510.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.58", "release": "2.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": 
"1.54", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.91", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.56", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.17", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.44", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.39", 
"release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.2.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "24.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "4.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.10", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.24.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": 
"2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.19", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "5.el10", "epoch": null, "arch": "noarch", 
"source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "3.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "18.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.32.3", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.6", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "17.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "20.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.11", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "10.el10", "epoch": null, "arch": "noarch", "source": 
"rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "11.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.45.153 closed. 
11044 1726853262.33382: done with _execute_module (package_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'package_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853261.7639287-12303-228565400709350/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11044 1726853262.33387: _low_level_execute_command(): starting 11044 1726853262.33390: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853261.7639287-12303-228565400709350/ > /dev/null 2>&1 && sleep 0' 11044 1726853262.34025: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11044 1726853262.34031: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11044 1726853262.34034: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11044 1726853262.34036: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11044 1726853262.34039: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 <<< 11044 1726853262.34041: stderr chunk (state=3): >>>debug2: match not found <<< 11044 1726853262.34043: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11044 1726853262.34076: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 11044 1726853262.34079: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.45.153 is 
address <<< 11044 1726853262.34082: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 11044 1726853262.34084: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11044 1726853262.34086: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11044 1726853262.34131: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11044 1726853262.34134: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 <<< 11044 1726853262.34136: stderr chunk (state=3): >>>debug2: match found <<< 11044 1726853262.34138: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11044 1726853262.34178: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 11044 1726853262.34230: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11044 1726853262.34234: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11044 1726853262.34278: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11044 1726853262.36159: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11044 1726853262.36216: stderr chunk (state=3): >>><<< 11044 1726853262.36232: stdout chunk (state=3): >>><<< 11044 1726853262.36261: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11044 1726853262.36276: handler run complete 11044 1726853262.37243: variable 'ansible_facts' from source: unknown 11044 1726853262.37740: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11044 1726853262.39761: variable 'ansible_facts' from source: unknown 11044 1726853262.40228: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11044 1726853262.41005: attempt loop complete, returning result 11044 1726853262.41024: _execute() done 11044 1726853262.41038: dumping result to json 11044 1726853262.41476: done dumping result, returning 11044 1726853262.41479: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check which packages are installed [02083763-bbaf-c5a6-f857-00000000049a] 11044 1726853262.41482: sending task result for task 02083763-bbaf-c5a6-f857-00000000049a 11044 1726853262.43733: done sending task result for task 02083763-bbaf-c5a6-f857-00000000049a 11044 1726853262.43737: WORKER PROCESS EXITING ok: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 11044 1726853262.43904: no more pending results, returning what we 
have 11044 1726853262.43911: results queue empty 11044 1726853262.43913: checking for any_errors_fatal 11044 1726853262.43917: done checking for any_errors_fatal 11044 1726853262.43917: checking for max_fail_percentage 11044 1726853262.43918: done checking for max_fail_percentage 11044 1726853262.43919: checking to see if all hosts have failed and the running result is not ok 11044 1726853262.43919: done checking to see if all hosts have failed 11044 1726853262.43920: getting the remaining hosts for this loop 11044 1726853262.43921: done getting the remaining hosts for this loop 11044 1726853262.43923: getting the next task for host managed_node1 11044 1726853262.43928: done getting next task for host managed_node1 11044 1726853262.43931: ^ task is: TASK: fedora.linux_system_roles.network : Print network provider 11044 1726853262.43934: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 11044 1726853262.43941: getting variables 11044 1726853262.43942: in VariableManager get_vars() 11044 1726853262.43967: Calling all_inventory to load vars for managed_node1 11044 1726853262.43968: Calling groups_inventory to load vars for managed_node1 11044 1726853262.43970: Calling all_plugins_inventory to load vars for managed_node1 11044 1726853262.43986: Calling all_plugins_play to load vars for managed_node1 11044 1726853262.43993: Calling groups_plugins_inventory to load vars for managed_node1 11044 1726853262.43995: Calling groups_plugins_play to load vars for managed_node1 11044 1726853262.44685: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11044 1726853262.45543: done with get_vars() 11044 1726853262.45562: done getting variables 11044 1726853262.45625: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Print network provider] ************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7 Friday 20 September 2024 13:27:42 -0400 (0:00:00.731) 0:00:26.832 ****** 11044 1726853262.45656: entering _queue_task() for managed_node1/debug 11044 1726853262.45993: worker is 1 (out of 1 available) 11044 1726853262.46010: exiting _queue_task() for managed_node1/debug 11044 1726853262.46024: done queuing things up, now waiting for results queue to drain 11044 1726853262.46025: waiting for pending results... 
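
Before the worker output resumes: the "Print network provider" task queued here lives at roles/network/tasks/main.yml:7, and the log later shows it emitting "Using network provider: nm". Based on that output it is presumably a plain debug task along these lines (a hedged reconstruction for orientation, not the role's verbatim source):

```yaml
# Sketch of the kind of task that would produce "Using network provider: nm".
# The exact task in fedora.linux_system_roles.network may differ.
- name: Print network provider
  debug:
    msg: "Using network provider: {{ network_provider }}"
```
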
11044 1726853262.46357: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Print network provider 11044 1726853262.46441: in run() - task 02083763-bbaf-c5a6-f857-00000000007e 11044 1726853262.46453: variable 'ansible_search_path' from source: unknown 11044 1726853262.46456: variable 'ansible_search_path' from source: unknown 11044 1726853262.46495: calling self._execute() 11044 1726853262.46676: variable 'ansible_host' from source: host vars for 'managed_node1' 11044 1726853262.46679: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11044 1726853262.46682: variable 'omit' from source: magic vars 11044 1726853262.46949: variable 'ansible_distribution_major_version' from source: facts 11044 1726853262.46967: Evaluated conditional (ansible_distribution_major_version != '6'): True 11044 1726853262.46981: variable 'omit' from source: magic vars 11044 1726853262.47050: variable 'omit' from source: magic vars 11044 1726853262.47147: variable 'network_provider' from source: set_fact 11044 1726853262.47170: variable 'omit' from source: magic vars 11044 1726853262.47215: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11044 1726853262.47253: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11044 1726853262.47279: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11044 1726853262.47299: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11044 1726853262.47315: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11044 1726853262.47348: variable 'inventory_hostname' from source: host vars for 'managed_node1' 11044 1726853262.47357: variable 'ansible_host' from source: host vars for 'managed_node1' 11044 
1726853262.47364: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11044 1726853262.47467: Set connection var ansible_timeout to 10 11044 1726853262.47485: Set connection var ansible_shell_executable to /bin/sh 11044 1726853262.47493: Set connection var ansible_shell_type to sh 11044 1726853262.47675: Set connection var ansible_module_compression to ZIP_DEFLATED 11044 1726853262.47678: Set connection var ansible_connection to ssh 11044 1726853262.47681: Set connection var ansible_pipelining to False 11044 1726853262.47688: variable 'ansible_shell_executable' from source: unknown 11044 1726853262.47691: variable 'ansible_connection' from source: unknown 11044 1726853262.47693: variable 'ansible_module_compression' from source: unknown 11044 1726853262.47695: variable 'ansible_shell_type' from source: unknown 11044 1726853262.47697: variable 'ansible_shell_executable' from source: unknown 11044 1726853262.47699: variable 'ansible_host' from source: host vars for 'managed_node1' 11044 1726853262.47701: variable 'ansible_pipelining' from source: unknown 11044 1726853262.47703: variable 'ansible_timeout' from source: unknown 11044 1726853262.47704: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11044 1726853262.47707: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11044 1726853262.47713: variable 'omit' from source: magic vars 11044 1726853262.47717: starting attempt loop 11044 1726853262.47727: running the handler 11044 1726853262.47782: handler run complete 11044 1726853262.47794: attempt loop complete, returning result 11044 1726853262.47797: _execute() done 11044 1726853262.47800: dumping result to json 11044 1726853262.47802: done dumping result, returning 
11044 1726853262.47808: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Print network provider [02083763-bbaf-c5a6-f857-00000000007e] 11044 1726853262.47810: sending task result for task 02083763-bbaf-c5a6-f857-00000000007e ok: [managed_node1] => {} MSG: Using network provider: nm 11044 1726853262.48029: no more pending results, returning what we have 11044 1726853262.48032: results queue empty 11044 1726853262.48033: checking for any_errors_fatal 11044 1726853262.48040: done checking for any_errors_fatal 11044 1726853262.48041: checking for max_fail_percentage 11044 1726853262.48043: done checking for max_fail_percentage 11044 1726853262.48046: checking to see if all hosts have failed and the running result is not ok 11044 1726853262.48047: done checking to see if all hosts have failed 11044 1726853262.48047: getting the remaining hosts for this loop 11044 1726853262.48049: done getting the remaining hosts for this loop 11044 1726853262.48051: getting the next task for host managed_node1 11044 1726853262.48070: done getting next task for host managed_node1 11044 1726853262.48075: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 11044 1726853262.48079: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 11044 1726853262.48089: done sending task result for task 02083763-bbaf-c5a6-f857-00000000007e 11044 1726853262.48091: WORKER PROCESS EXITING 11044 1726853262.48098: getting variables 11044 1726853262.48099: in VariableManager get_vars() 11044 1726853262.48139: Calling all_inventory to load vars for managed_node1 11044 1726853262.48141: Calling groups_inventory to load vars for managed_node1 11044 1726853262.48146: Calling all_plugins_inventory to load vars for managed_node1 11044 1726853262.48154: Calling all_plugins_play to load vars for managed_node1 11044 1726853262.48156: Calling groups_plugins_inventory to load vars for managed_node1 11044 1726853262.48159: Calling groups_plugins_play to load vars for managed_node1 11044 1726853262.54162: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11044 1726853262.55891: done with get_vars() 11044 1726853262.55925: done getting variables 11044 1726853262.55985: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11 Friday 20 September 2024 13:27:42 -0400 (0:00:00.103) 0:00:26.935 ****** 11044 1726853262.56018: entering _queue_task() for managed_node1/fail 11044 1726853262.56422: worker is 1 (out of 1 available) 11044 1726853262.56434: exiting _queue_task() for 
managed_node1/fail 11044 1726853262.56447: done queuing things up, now waiting for results queue to drain 11044 1726853262.56449: waiting for pending results... 11044 1726853262.56987: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 11044 1726853262.57060: in run() - task 02083763-bbaf-c5a6-f857-00000000007f 11044 1726853262.57087: variable 'ansible_search_path' from source: unknown 11044 1726853262.57098: variable 'ansible_search_path' from source: unknown 11044 1726853262.57139: calling self._execute() 11044 1726853262.57242: variable 'ansible_host' from source: host vars for 'managed_node1' 11044 1726853262.57260: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11044 1726853262.57280: variable 'omit' from source: magic vars 11044 1726853262.57648: variable 'ansible_distribution_major_version' from source: facts 11044 1726853262.57666: Evaluated conditional (ansible_distribution_major_version != '6'): True 11044 1726853262.57792: variable 'network_state' from source: role '' defaults 11044 1726853262.57808: Evaluated conditional (network_state != {}): False 11044 1726853262.57978: when evaluation is False, skipping this task 11044 1726853262.57981: _execute() done 11044 1726853262.57984: dumping result to json 11044 1726853262.57987: done dumping result, returning 11044 1726853262.57990: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider [02083763-bbaf-c5a6-f857-00000000007f] 11044 1726853262.57993: sending task result for task 02083763-bbaf-c5a6-f857-00000000007f 11044 1726853262.58063: done sending task result for task 02083763-bbaf-c5a6-f857-00000000007f 11044 1726853262.58067: WORKER PROCESS EXITING skipping: [managed_node1] => { 
"changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 11044 1726853262.58117: no more pending results, returning what we have 11044 1726853262.58121: results queue empty 11044 1726853262.58121: checking for any_errors_fatal 11044 1726853262.58130: done checking for any_errors_fatal 11044 1726853262.58130: checking for max_fail_percentage 11044 1726853262.58132: done checking for max_fail_percentage 11044 1726853262.58133: checking to see if all hosts have failed and the running result is not ok 11044 1726853262.58134: done checking to see if all hosts have failed 11044 1726853262.58135: getting the remaining hosts for this loop 11044 1726853262.58136: done getting the remaining hosts for this loop 11044 1726853262.58140: getting the next task for host managed_node1 11044 1726853262.58145: done getting next task for host managed_node1 11044 1726853262.58149: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 11044 1726853262.58153: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 11044 1726853262.58169: getting variables 11044 1726853262.58173: in VariableManager get_vars() 11044 1726853262.58298: Calling all_inventory to load vars for managed_node1 11044 1726853262.58301: Calling groups_inventory to load vars for managed_node1 11044 1726853262.58303: Calling all_plugins_inventory to load vars for managed_node1 11044 1726853262.58313: Calling all_plugins_play to load vars for managed_node1 11044 1726853262.58316: Calling groups_plugins_inventory to load vars for managed_node1 11044 1726853262.58319: Calling groups_plugins_play to load vars for managed_node1 11044 1726853262.59634: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11044 1726853262.61187: done with get_vars() 11044 1726853262.61206: done getting variables 11044 1726853262.61276: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18 Friday 20 September 2024 13:27:42 -0400 (0:00:00.052) 0:00:26.988 ****** 11044 1726853262.61303: entering _queue_task() for managed_node1/fail 11044 1726853262.61562: worker is 1 (out of 1 available) 11044 1726853262.61577: exiting _queue_task() for managed_node1/fail 11044 1726853262.61591: done queuing things up, now waiting for results queue to drain 11044 1726853262.61592: waiting for pending results... 
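
The two "Abort applying the network state configuration ..." tasks in this stretch are skipped because their guard, `when: network_state != {}`, evaluates to False against the role default of an empty dict. The skip logic can be mirrored in plain Python (an illustrative helper, not Ansible's actual TaskExecutor code):

```python
def should_run(network_state: dict) -> bool:
    """Mirror the role's `when: network_state != {}` guard:
    the abort task only runs when a network_state was supplied."""
    return network_state != {}

# Role default is an empty dict, so the task is skipped
# ("Conditional result was False" in the log above).
print(should_run({}))                    # skipped
print(should_run({"interfaces": []}))    # would run
```
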
11044 1726853262.61775: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 11044 1726853262.61872: in run() - task 02083763-bbaf-c5a6-f857-000000000080 11044 1726853262.61882: variable 'ansible_search_path' from source: unknown 11044 1726853262.61886: variable 'ansible_search_path' from source: unknown 11044 1726853262.61915: calling self._execute() 11044 1726853262.61993: variable 'ansible_host' from source: host vars for 'managed_node1' 11044 1726853262.61998: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11044 1726853262.62007: variable 'omit' from source: magic vars 11044 1726853262.62294: variable 'ansible_distribution_major_version' from source: facts 11044 1726853262.62303: Evaluated conditional (ansible_distribution_major_version != '6'): True 11044 1726853262.62389: variable 'network_state' from source: role '' defaults 11044 1726853262.62397: Evaluated conditional (network_state != {}): False 11044 1726853262.62400: when evaluation is False, skipping this task 11044 1726853262.62402: _execute() done 11044 1726853262.62405: dumping result to json 11044 1726853262.62408: done dumping result, returning 11044 1726853262.62416: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 [02083763-bbaf-c5a6-f857-000000000080] 11044 1726853262.62419: sending task result for task 02083763-bbaf-c5a6-f857-000000000080 11044 1726853262.62507: done sending task result for task 02083763-bbaf-c5a6-f857-000000000080 11044 1726853262.62510: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 11044 1726853262.62557: no more pending results, returning what we have 11044 
1726853262.62560: results queue empty 11044 1726853262.62561: checking for any_errors_fatal 11044 1726853262.62566: done checking for any_errors_fatal 11044 1726853262.62567: checking for max_fail_percentage 11044 1726853262.62569: done checking for max_fail_percentage 11044 1726853262.62570: checking to see if all hosts have failed and the running result is not ok 11044 1726853262.62573: done checking to see if all hosts have failed 11044 1726853262.62574: getting the remaining hosts for this loop 11044 1726853262.62575: done getting the remaining hosts for this loop 11044 1726853262.62578: getting the next task for host managed_node1 11044 1726853262.62584: done getting next task for host managed_node1 11044 1726853262.62588: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 11044 1726853262.62591: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 11044 1726853262.62611: getting variables 11044 1726853262.62612: in VariableManager get_vars() 11044 1726853262.62658: Calling all_inventory to load vars for managed_node1 11044 1726853262.62660: Calling groups_inventory to load vars for managed_node1 11044 1726853262.62662: Calling all_plugins_inventory to load vars for managed_node1 11044 1726853262.62672: Calling all_plugins_play to load vars for managed_node1 11044 1726853262.62675: Calling groups_plugins_inventory to load vars for managed_node1 11044 1726853262.62678: Calling groups_plugins_play to load vars for managed_node1 11044 1726853262.63922: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11044 1726853262.64865: done with get_vars() 11044 1726853262.64883: done getting variables 11044 1726853262.64925: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25 Friday 20 September 2024 13:27:42 -0400 (0:00:00.036) 0:00:27.025 ****** 11044 1726853262.64951: entering _queue_task() for managed_node1/fail 11044 1726853262.65189: worker is 1 (out of 1 available) 11044 1726853262.65202: exiting _queue_task() for managed_node1/fail 11044 1726853262.65214: done queuing things up, now waiting for results queue to drain 11044 1726853262.65216: waiting for pending results... 
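
The teaming-abort task queued next is guarded by a chained Jinja2 filter expression, visible further down in this log: `network_connections | selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0`. Its logic can be re-implemented in plain Python to show why the bond/ethernet profiles in this run produce a False result (an illustrative re-implementation, not Ansible's code):

```python
import re

def has_team_interface(connections):
    """Mimic: connections | selectattr("type", "defined")
             | selectattr("type", "match", "^team$") | list | length > 0
    Keep only entries that define "type" AND whose type matches ^team$,
    then test whether any survive."""
    matching = [
        c for c in connections
        if "type" in c and re.match(r"^team$", c["type"])
    ]
    return len(matching) > 0

# The controller/port profiles in this run are bond-style, so no match:
print(has_team_interface([{"name": "bond0", "type": "bond"},
                          {"name": "bond0.0", "type": "ethernet"}]))
print(has_team_interface([{"name": "team0", "type": "team"}]))
```

Entries without a "type" key are dropped by the `"defined"` test before the regex match is attempted, which is why the filter chain is safe on heterogeneous profile lists.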
11044 1726853262.65400: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 11044 1726853262.65502: in run() - task 02083763-bbaf-c5a6-f857-000000000081 11044 1726853262.65513: variable 'ansible_search_path' from source: unknown 11044 1726853262.65516: variable 'ansible_search_path' from source: unknown 11044 1726853262.65545: calling self._execute() 11044 1726853262.65621: variable 'ansible_host' from source: host vars for 'managed_node1' 11044 1726853262.65626: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11044 1726853262.65634: variable 'omit' from source: magic vars 11044 1726853262.65912: variable 'ansible_distribution_major_version' from source: facts 11044 1726853262.65921: Evaluated conditional (ansible_distribution_major_version != '6'): True 11044 1726853262.66040: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 11044 1726853262.68379: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 11044 1726853262.68383: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 11044 1726853262.68386: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 11044 1726853262.68388: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 11044 1726853262.68413: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 11044 1726853262.68506: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11044 1726853262.68594: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11044 1726853262.68628: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11044 1726853262.68677: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11044 1726853262.68696: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11044 1726853262.68837: variable 'ansible_distribution_major_version' from source: facts 11044 1726853262.68861: Evaluated conditional (ansible_distribution_major_version | int > 9): True 11044 1726853262.68985: variable 'ansible_distribution' from source: facts 11044 1726853262.68994: variable '__network_rh_distros' from source: role '' defaults 11044 1726853262.69007: Evaluated conditional (ansible_distribution in __network_rh_distros): True 11044 1726853262.69515: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11044 1726853262.69548: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11044 1726853262.69777: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11044 
1726853262.69781: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11044 1726853262.69783: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11044 1726853262.69978: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11044 1726853262.69982: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11044 1726853262.69984: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11044 1726853262.70111: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11044 1726853262.70130: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11044 1726853262.70212: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11044 1726853262.70273: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, 
class_only=False) 11044 1726853262.70408: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11044 1726853262.70466: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11044 1726853262.70489: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11044 1726853262.70908: variable 'network_connections' from source: task vars 11044 1726853262.70925: variable 'port2_profile' from source: play vars 11044 1726853262.70999: variable 'port2_profile' from source: play vars 11044 1726853262.71016: variable 'port1_profile' from source: play vars 11044 1726853262.71084: variable 'port1_profile' from source: play vars 11044 1726853262.71097: variable 'controller_profile' from source: play vars 11044 1726853262.71160: variable 'controller_profile' from source: play vars 11044 1726853262.71175: variable 'network_state' from source: role '' defaults 11044 1726853262.71247: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 11044 1726853262.71441: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 11044 1726853262.71490: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 11044 1726853262.71524: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 11044 1726853262.71560: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 11044 1726853262.71620: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 11044 1726853262.71677: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 11044 1726853262.71682: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 11044 1726853262.71712: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 11044 1726853262.71749: Evaluated conditional (network_connections | selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0 or network_state.get("interfaces", []) | selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0): False 11044 1726853262.71877: when evaluation is False, skipping this task 11044 1726853262.71880: _execute() done 11044 1726853262.71882: dumping result to json 11044 1726853262.71884: done dumping result, returning 11044 1726853262.71887: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later [02083763-bbaf-c5a6-f857-000000000081] 11044 1726853262.71889: sending task result for task 02083763-bbaf-c5a6-f857-000000000081 11044 1726853262.71962: done sending task result for task 02083763-bbaf-c5a6-f857-000000000081 11044 1726853262.71965: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "network_connections | selectattr(\"type\", \"defined\") | selectattr(\"type\", \"match\", \"^team$\") | list | length > 0 
or network_state.get(\"interfaces\", []) | selectattr(\"type\", \"defined\") | selectattr(\"type\", \"match\", \"^team$\") | list | length > 0", "skip_reason": "Conditional result was False" } 11044 1726853262.72018: no more pending results, returning what we have 11044 1726853262.72021: results queue empty 11044 1726853262.72022: checking for any_errors_fatal 11044 1726853262.72028: done checking for any_errors_fatal 11044 1726853262.72028: checking for max_fail_percentage 11044 1726853262.72030: done checking for max_fail_percentage 11044 1726853262.72031: checking to see if all hosts have failed and the running result is not ok 11044 1726853262.72031: done checking to see if all hosts have failed 11044 1726853262.72032: getting the remaining hosts for this loop 11044 1726853262.72033: done getting the remaining hosts for this loop 11044 1726853262.72037: getting the next task for host managed_node1 11044 1726853262.72044: done getting next task for host managed_node1 11044 1726853262.72047: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 11044 1726853262.72051: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False), did rescue? False, did start at task? False 11044 1726853262.72068: getting variables 11044 1726853262.72069: in VariableManager get_vars() 11044 1726853262.72114: Calling all_inventory to load vars for managed_node1 11044 1726853262.72116: Calling groups_inventory to load vars for managed_node1 11044 1726853262.72119: Calling all_plugins_inventory to load vars for managed_node1 11044 1726853262.72128: Calling all_plugins_play to load vars for managed_node1 11044 1726853262.72131: Calling groups_plugins_inventory to load vars for managed_node1 11044 1726853262.72134: Calling groups_plugins_play to load vars for managed_node1 11044 1726853262.75225: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11044 1726853262.78453: done with get_vars() 11044 1726853262.78686: done getting variables 11044 1726853262.78751: Loading ActionModule 'dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36 Friday 20 September 2024 13:27:42 -0400 (0:00:00.138) 0:00:27.163 ****** 11044 1726853262.78788: entering _queue_task() for managed_node1/dnf 11044 1726853262.79558: worker is 1 (out of 1 available) 11044 1726853262.79572: exiting _queue_task() for managed_node1/dnf 11044 1726853262.79586: done queuing things up, now waiting for results queue to drain 11044 1726853262.79588: waiting for pending results... 
11044 1726853262.80215: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 11044 1726853262.80428: in run() - task 02083763-bbaf-c5a6-f857-000000000082 11044 1726853262.80450: variable 'ansible_search_path' from source: unknown 11044 1726853262.80457: variable 'ansible_search_path' from source: unknown 11044 1726853262.80577: calling self._execute() 11044 1726853262.80709: variable 'ansible_host' from source: host vars for 'managed_node1' 11044 1726853262.80964: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11044 1726853262.80967: variable 'omit' from source: magic vars 11044 1726853262.81565: variable 'ansible_distribution_major_version' from source: facts 11044 1726853262.81585: Evaluated conditional (ansible_distribution_major_version != '6'): True 11044 1726853262.82146: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 11044 1726853262.87030: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 11044 1726853262.87251: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 11044 1726853262.87317: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 11044 1726853262.87415: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 11044 1726853262.87521: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 11044 1726853262.87776: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11044 1726853262.87779: Loading 
FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11044 1726853262.87782: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11044 1726853262.87853: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11044 1726853262.87944: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11044 1726853262.88178: variable 'ansible_distribution' from source: facts 11044 1726853262.88188: variable 'ansible_distribution_major_version' from source: facts 11044 1726853262.88208: Evaluated conditional (ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7): True 11044 1726853262.88442: variable '__network_wireless_connections_defined' from source: role '' defaults 11044 1726853262.88905: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11044 1726853262.88908: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11044 1726853262.88911: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11044 1726853262.88913: Loading 
FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11044 1726853262.88915: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11044 1726853262.89083: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11044 1726853262.89111: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11044 1726853262.89143: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11044 1726853262.89268: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11044 1726853262.89289: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11044 1726853262.89332: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11044 1726853262.89557: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11044 
1726853262.89561: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11044 1726853262.89563: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11044 1726853262.89565: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11044 1726853262.89917: variable 'network_connections' from source: task vars 11044 1726853262.89934: variable 'port2_profile' from source: play vars 11044 1726853262.90176: variable 'port2_profile' from source: play vars 11044 1726853262.90179: variable 'port1_profile' from source: play vars 11044 1726853262.90210: variable 'port1_profile' from source: play vars 11044 1726853262.90223: variable 'controller_profile' from source: play vars 11044 1726853262.90336: variable 'controller_profile' from source: play vars 11044 1726853262.90489: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 11044 1726853262.90877: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 11044 1726853262.91013: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 11044 1726853262.91050: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 11044 1726853262.91106: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 11044 1726853262.91340: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 
(found_in_cache=True, class_only=False) 11044 1726853262.91353: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 11044 1726853262.91355: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 11044 1726853262.91559: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 11044 1726853262.91562: variable '__network_team_connections_defined' from source: role '' defaults 11044 1726853262.91951: variable 'network_connections' from source: task vars 11044 1726853262.92000: variable 'port2_profile' from source: play vars 11044 1726853262.92055: variable 'port2_profile' from source: play vars 11044 1726853262.92157: variable 'port1_profile' from source: play vars 11044 1726853262.92292: variable 'port1_profile' from source: play vars 11044 1726853262.92305: variable 'controller_profile' from source: play vars 11044 1726853262.92373: variable 'controller_profile' from source: play vars 11044 1726853262.92458: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 11044 1726853262.92576: when evaluation is False, skipping this task 11044 1726853262.92579: _execute() done 11044 1726853262.92581: dumping result to json 11044 1726853262.92584: done dumping result, returning 11044 1726853262.92587: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces [02083763-bbaf-c5a6-f857-000000000082] 11044 1726853262.92590: sending task result for task 
02083763-bbaf-c5a6-f857-000000000082 11044 1726853262.92868: done sending task result for task 02083763-bbaf-c5a6-f857-000000000082 skipping: [managed_node1] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 11044 1726853262.92924: no more pending results, returning what we have 11044 1726853262.92927: results queue empty 11044 1726853262.92928: checking for any_errors_fatal 11044 1726853262.92935: done checking for any_errors_fatal 11044 1726853262.92935: checking for max_fail_percentage 11044 1726853262.92937: done checking for max_fail_percentage 11044 1726853262.92938: checking to see if all hosts have failed and the running result is not ok 11044 1726853262.92939: done checking to see if all hosts have failed 11044 1726853262.92940: getting the remaining hosts for this loop 11044 1726853262.92941: done getting the remaining hosts for this loop 11044 1726853262.92947: getting the next task for host managed_node1 11044 1726853262.92955: done getting next task for host managed_node1 11044 1726853262.92959: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 11044 1726853262.92963: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 11044 1726853262.92983: getting variables 11044 1726853262.92985: in VariableManager get_vars() 11044 1726853262.93025: Calling all_inventory to load vars for managed_node1 11044 1726853262.93027: Calling groups_inventory to load vars for managed_node1 11044 1726853262.93030: Calling all_plugins_inventory to load vars for managed_node1 11044 1726853262.93040: Calling all_plugins_play to load vars for managed_node1 11044 1726853262.93043: Calling groups_plugins_inventory to load vars for managed_node1 11044 1726853262.93048: Calling groups_plugins_play to load vars for managed_node1 11044 1726853262.93685: WORKER PROCESS EXITING 11044 1726853262.96029: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11044 1726853262.97847: done with get_vars() 11044 1726853262.97879: done getting variables redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 11044 1726853262.97963: Loading ActionModule 'ansible_collections.ansible.builtin.plugins.action.dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48 Friday 20 September 2024 13:27:42 -0400 (0:00:00.192) 0:00:27.355 ****** 11044 1726853262.97999: entering _queue_task() for managed_node1/yum 11044 1726853262.98573: worker is 1 (out of 1 
available) 11044 1726853262.98582: exiting _queue_task() for managed_node1/yum 11044 1726853262.98592: done queuing things up, now waiting for results queue to drain 11044 1726853262.98593: waiting for pending results... 11044 1726853262.98717: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 11044 1726853262.98986: in run() - task 02083763-bbaf-c5a6-f857-000000000083 11044 1726853262.99007: variable 'ansible_search_path' from source: unknown 11044 1726853262.99016: variable 'ansible_search_path' from source: unknown 11044 1726853262.99069: calling self._execute() 11044 1726853262.99206: variable 'ansible_host' from source: host vars for 'managed_node1' 11044 1726853262.99287: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11044 1726853262.99303: variable 'omit' from source: magic vars 11044 1726853263.00138: variable 'ansible_distribution_major_version' from source: facts 11044 1726853263.00161: Evaluated conditional (ansible_distribution_major_version != '6'): True 11044 1726853263.00564: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 11044 1726853263.03328: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 11044 1726853263.03431: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 11044 1726853263.03511: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 11044 1726853263.03533: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 11044 1726853263.03568: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 11044 1726853263.03696: Loading 
FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11044 1726853263.03704: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11044 1726853263.03739: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11044 1726853263.03930: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11044 1726853263.03954: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11044 1726853263.04077: variable 'ansible_distribution_major_version' from source: facts 11044 1726853263.04104: Evaluated conditional (ansible_distribution_major_version | int < 8): False 11044 1726853263.04113: when evaluation is False, skipping this task 11044 1726853263.04121: _execute() done 11044 1726853263.04278: dumping result to json 11044 1726853263.04281: done dumping result, returning 11044 1726853263.04284: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces [02083763-bbaf-c5a6-f857-000000000083] 11044 1726853263.04287: sending task result for task 02083763-bbaf-c5a6-f857-000000000083 11044 1726853263.04364: done sending task result for task 02083763-bbaf-c5a6-f857-000000000083 11044 1726853263.04367: WORKER PROCESS EXITING 
skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version | int < 8", "skip_reason": "Conditional result was False" } 11044 1726853263.04419: no more pending results, returning what we have 11044 1726853263.04423: results queue empty 11044 1726853263.04424: checking for any_errors_fatal 11044 1726853263.04430: done checking for any_errors_fatal 11044 1726853263.04431: checking for max_fail_percentage 11044 1726853263.04433: done checking for max_fail_percentage 11044 1726853263.04434: checking to see if all hosts have failed and the running result is not ok 11044 1726853263.04435: done checking to see if all hosts have failed 11044 1726853263.04436: getting the remaining hosts for this loop 11044 1726853263.04437: done getting the remaining hosts for this loop 11044 1726853263.04442: getting the next task for host managed_node1 11044 1726853263.04452: done getting next task for host managed_node1 11044 1726853263.04455: ^ task is: TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 11044 1726853263.04460: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 11044 1726853263.04484: getting variables 11044 1726853263.04486: in VariableManager get_vars() 11044 1726853263.04530: Calling all_inventory to load vars for managed_node1 11044 1726853263.04533: Calling groups_inventory to load vars for managed_node1 11044 1726853263.04535: Calling all_plugins_inventory to load vars for managed_node1 11044 1726853263.04549: Calling all_plugins_play to load vars for managed_node1 11044 1726853263.04553: Calling groups_plugins_inventory to load vars for managed_node1 11044 1726853263.04556: Calling groups_plugins_play to load vars for managed_node1 11044 1726853263.06317: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11044 1726853263.08713: done with get_vars() 11044 1726853263.08738: done getting variables 11044 1726853263.08809: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60 Friday 20 September 2024 13:27:43 -0400 (0:00:00.108) 0:00:27.464 ****** 11044 1726853263.08850: entering _queue_task() for managed_node1/fail 11044 1726853263.09219: worker is 1 (out of 1 available) 11044 1726853263.09232: exiting _queue_task() for managed_node1/fail 11044 1726853263.09250: done queuing things up, now waiting for results queue to drain 11044 1726853263.09251: waiting for pending results... 
11044 1726853263.09592: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 11044 1726853263.09695: in run() - task 02083763-bbaf-c5a6-f857-000000000084 11044 1726853263.09706: variable 'ansible_search_path' from source: unknown 11044 1726853263.09710: variable 'ansible_search_path' from source: unknown 11044 1726853263.09740: calling self._execute() 11044 1726853263.09818: variable 'ansible_host' from source: host vars for 'managed_node1' 11044 1726853263.09823: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11044 1726853263.09832: variable 'omit' from source: magic vars 11044 1726853263.10194: variable 'ansible_distribution_major_version' from source: facts 11044 1726853263.10258: Evaluated conditional (ansible_distribution_major_version != '6'): True 11044 1726853263.10382: variable '__network_wireless_connections_defined' from source: role '' defaults 11044 1726853263.10520: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 11044 1726853263.13125: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 11044 1726853263.13339: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 11044 1726853263.13389: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 11044 1726853263.13484: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 11044 1726853263.13587: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 11044 1726853263.13775: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 
(found_in_cache=True, class_only=False) 11044 1726853263.13799: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11044 1726853263.13826: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11044 1726853263.13885: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11044 1726853263.13899: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11044 1726853263.13947: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11044 1726853263.13975: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11044 1726853263.14000: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11044 1726853263.14039: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11044 1726853263.14059: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11044 1726853263.14102: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11044 1726853263.14126: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11044 1726853263.14154: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11044 1726853263.14194: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11044 1726853263.14209: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11044 1726853263.14403: variable 'network_connections' from source: task vars 11044 1726853263.14424: variable 'port2_profile' from source: play vars 11044 1726853263.14493: variable 'port2_profile' from source: play vars 11044 1726853263.14536: variable 'port1_profile' from source: play vars 11044 1726853263.14569: variable 'port1_profile' from source: play vars 11044 1726853263.14579: variable 'controller_profile' from source: play vars 11044 1726853263.14643: variable 'controller_profile' from source: play vars 11044 1726853263.14714: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 11044 1726853263.14910: Loading TestModule 'core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 11044 1726853263.14958: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 11044 1726853263.14990: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 11044 1726853263.15017: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 11044 1726853263.15060: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 11044 1726853263.15085: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 11044 1726853263.15157: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 11044 1726853263.15160: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 11044 1726853263.15195: variable '__network_team_connections_defined' from source: role '' defaults 11044 1726853263.15504: variable 'network_connections' from source: task vars 11044 1726853263.15516: variable 'port2_profile' from source: play vars 11044 1726853263.15577: variable 'port2_profile' from source: play vars 11044 1726853263.15581: variable 'port1_profile' from source: play vars 11044 1726853263.15736: variable 'port1_profile' from source: play vars 11044 1726853263.15742: variable 'controller_profile' from source: play vars 11044 1726853263.15893: variable 'controller_profile' from source: play vars 11044 1726853263.15920: Evaluated conditional 
(__network_wireless_connections_defined or __network_team_connections_defined): False 11044 1726853263.15931: when evaluation is False, skipping this task 11044 1726853263.15934: _execute() done 11044 1726853263.15937: dumping result to json 11044 1726853263.15939: done dumping result, returning 11044 1726853263.15942: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces [02083763-bbaf-c5a6-f857-000000000084] 11044 1726853263.15944: sending task result for task 02083763-bbaf-c5a6-f857-000000000084 11044 1726853263.16243: done sending task result for task 02083763-bbaf-c5a6-f857-000000000084 11044 1726853263.16246: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 11044 1726853263.16320: no more pending results, returning what we have 11044 1726853263.16323: results queue empty 11044 1726853263.16324: checking for any_errors_fatal 11044 1726853263.16329: done checking for any_errors_fatal 11044 1726853263.16329: checking for max_fail_percentage 11044 1726853263.16331: done checking for max_fail_percentage 11044 1726853263.16332: checking to see if all hosts have failed and the running result is not ok 11044 1726853263.16332: done checking to see if all hosts have failed 11044 1726853263.16333: getting the remaining hosts for this loop 11044 1726853263.16334: done getting the remaining hosts for this loop 11044 1726853263.16337: getting the next task for host managed_node1 11044 1726853263.16343: done getting next task for host managed_node1 11044 1726853263.16346: ^ task is: TASK: fedora.linux_system_roles.network : Install packages 11044 1726853263.16351: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, 
update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 11044 1726853263.16587: getting variables 11044 1726853263.16589: in VariableManager get_vars() 11044 1726853263.16628: Calling all_inventory to load vars for managed_node1 11044 1726853263.16631: Calling groups_inventory to load vars for managed_node1 11044 1726853263.16633: Calling all_plugins_inventory to load vars for managed_node1 11044 1726853263.16641: Calling all_plugins_play to load vars for managed_node1 11044 1726853263.16644: Calling groups_plugins_inventory to load vars for managed_node1 11044 1726853263.16647: Calling groups_plugins_play to load vars for managed_node1 11044 1726853263.18110: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11044 1726853263.19449: done with get_vars() 11044 1726853263.19472: done getting variables 11044 1726853263.19520: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install packages] ******************** task path: 
/tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73 Friday 20 September 2024 13:27:43 -0400 (0:00:00.107) 0:00:27.571 ****** 11044 1726853263.19550: entering _queue_task() for managed_node1/package 11044 1726853263.19811: worker is 1 (out of 1 available) 11044 1726853263.19825: exiting _queue_task() for managed_node1/package 11044 1726853263.19839: done queuing things up, now waiting for results queue to drain 11044 1726853263.19841: waiting for pending results... 11044 1726853263.20015: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install packages 11044 1726853263.20110: in run() - task 02083763-bbaf-c5a6-f857-000000000085 11044 1726853263.20120: variable 'ansible_search_path' from source: unknown 11044 1726853263.20124: variable 'ansible_search_path' from source: unknown 11044 1726853263.20153: calling self._execute() 11044 1726853263.20234: variable 'ansible_host' from source: host vars for 'managed_node1' 11044 1726853263.20239: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11044 1726853263.20249: variable 'omit' from source: magic vars 11044 1726853263.20521: variable 'ansible_distribution_major_version' from source: facts 11044 1726853263.20530: Evaluated conditional (ansible_distribution_major_version != '6'): True 11044 1726853263.20664: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 11044 1726853263.20854: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 11044 1726853263.20888: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 11044 1726853263.20912: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 11044 1726853263.20967: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 11044 
1726853263.21049: variable 'network_packages' from source: role '' defaults 11044 1726853263.21122: variable '__network_provider_setup' from source: role '' defaults 11044 1726853263.21132: variable '__network_service_name_default_nm' from source: role '' defaults 11044 1726853263.21179: variable '__network_service_name_default_nm' from source: role '' defaults 11044 1726853263.21186: variable '__network_packages_default_nm' from source: role '' defaults 11044 1726853263.21228: variable '__network_packages_default_nm' from source: role '' defaults 11044 1726853263.21348: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 11044 1726853263.22893: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 11044 1726853263.22935: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 11044 1726853263.22962: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 11044 1726853263.22986: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 11044 1726853263.23004: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 11044 1726853263.23068: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11044 1726853263.23090: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11044 1726853263.23108: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 
(found_in_cache=True, class_only=False) 11044 1726853263.23135: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11044 1726853263.23149: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11044 1726853263.23180: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11044 1726853263.23195: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11044 1726853263.23211: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11044 1726853263.23236: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11044 1726853263.23249: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11044 1726853263.23388: variable '__network_packages_default_gobject_packages' from source: role '' defaults 11044 1726853263.23472: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11044 1726853263.23489: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11044 1726853263.23505: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11044 1726853263.23529: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11044 1726853263.23539: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11044 1726853263.23602: variable 'ansible_python' from source: facts 11044 1726853263.23621: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 11044 1726853263.23678: variable '__network_wpa_supplicant_required' from source: role '' defaults 11044 1726853263.23732: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 11044 1726853263.23816: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11044 1726853263.23832: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11044 1726853263.23849: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11044 1726853263.23875: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11044 1726853263.23885: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11044 1726853263.23920: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11044 1726853263.23938: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11044 1726853263.23955: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11044 1726853263.23981: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11044 1726853263.23991: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11044 1726853263.24085: variable 'network_connections' from source: task vars 11044 1726853263.24089: variable 'port2_profile' from source: play vars 11044 1726853263.24160: variable 'port2_profile' from source: play vars 11044 1726853263.24169: variable 'port1_profile' from source: play vars 11044 1726853263.24239: variable 'port1_profile' from source: play vars 11044 1726853263.24249: variable 'controller_profile' from source: play vars 11044 1726853263.24314: 
variable 'controller_profile' from source: play vars 11044 1726853263.24365: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 11044 1726853263.24386: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 11044 1726853263.24405: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 11044 1726853263.24425: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 11044 1726853263.24466: variable '__network_wireless_connections_defined' from source: role '' defaults 11044 1726853263.24639: variable 'network_connections' from source: task vars 11044 1726853263.24642: variable 'port2_profile' from source: play vars 11044 1726853263.24713: variable 'port2_profile' from source: play vars 11044 1726853263.24721: variable 'port1_profile' from source: play vars 11044 1726853263.24791: variable 'port1_profile' from source: play vars 11044 1726853263.24799: variable 'controller_profile' from source: play vars 11044 1726853263.24865: variable 'controller_profile' from source: play vars 11044 1726853263.24892: variable '__network_packages_default_wireless' from source: role '' defaults 11044 1726853263.24948: variable '__network_wireless_connections_defined' from source: role '' defaults 11044 1726853263.25139: variable 'network_connections' from source: task vars 11044 1726853263.25143: variable 'port2_profile' from source: play vars 11044 1726853263.25189: variable 'port2_profile' from source: play vars 11044 1726853263.25195: variable 
'port1_profile' from source: play vars 11044 1726853263.25239: variable 'port1_profile' from source: play vars 11044 1726853263.25248: variable 'controller_profile' from source: play vars 11044 1726853263.25291: variable 'controller_profile' from source: play vars 11044 1726853263.25314: variable '__network_packages_default_team' from source: role '' defaults 11044 1726853263.25363: variable '__network_team_connections_defined' from source: role '' defaults 11044 1726853263.25555: variable 'network_connections' from source: task vars 11044 1726853263.25559: variable 'port2_profile' from source: play vars 11044 1726853263.25604: variable 'port2_profile' from source: play vars 11044 1726853263.25610: variable 'port1_profile' from source: play vars 11044 1726853263.25657: variable 'port1_profile' from source: play vars 11044 1726853263.25663: variable 'controller_profile' from source: play vars 11044 1726853263.25708: variable 'controller_profile' from source: play vars 11044 1726853263.25750: variable '__network_service_name_default_initscripts' from source: role '' defaults 11044 1726853263.25790: variable '__network_service_name_default_initscripts' from source: role '' defaults 11044 1726853263.25795: variable '__network_packages_default_initscripts' from source: role '' defaults 11044 1726853263.25836: variable '__network_packages_default_initscripts' from source: role '' defaults 11044 1726853263.25972: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 11044 1726853263.26263: variable 'network_connections' from source: task vars 11044 1726853263.26267: variable 'port2_profile' from source: play vars 11044 1726853263.26313: variable 'port2_profile' from source: play vars 11044 1726853263.26319: variable 'port1_profile' from source: play vars 11044 1726853263.26359: variable 'port1_profile' from source: play vars 11044 1726853263.26365: variable 'controller_profile' from source: play vars 11044 1726853263.26410: variable 
'controller_profile' from source: play vars 11044 1726853263.26417: variable 'ansible_distribution' from source: facts 11044 1726853263.26421: variable '__network_rh_distros' from source: role '' defaults 11044 1726853263.26427: variable 'ansible_distribution_major_version' from source: facts 11044 1726853263.26438: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 11044 1726853263.26547: variable 'ansible_distribution' from source: facts 11044 1726853263.26550: variable '__network_rh_distros' from source: role '' defaults 11044 1726853263.26553: variable 'ansible_distribution_major_version' from source: facts 11044 1726853263.26563: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 11044 1726853263.26668: variable 'ansible_distribution' from source: facts 11044 1726853263.26673: variable '__network_rh_distros' from source: role '' defaults 11044 1726853263.26676: variable 'ansible_distribution_major_version' from source: facts 11044 1726853263.26702: variable 'network_provider' from source: set_fact 11044 1726853263.26719: variable 'ansible_facts' from source: unknown 11044 1726853263.27078: Evaluated conditional (not network_packages is subset(ansible_facts.packages.keys())): False 11044 1726853263.27082: when evaluation is False, skipping this task 11044 1726853263.27085: _execute() done 11044 1726853263.27087: dumping result to json 11044 1726853263.27089: done dumping result, returning 11044 1726853263.27097: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install packages [02083763-bbaf-c5a6-f857-000000000085] 11044 1726853263.27099: sending task result for task 02083763-bbaf-c5a6-f857-000000000085 skipping: [managed_node1] => { "changed": false, "false_condition": "not network_packages is subset(ansible_facts.packages.keys())", "skip_reason": "Conditional result was False" } 11044 1726853263.27247: no more pending results, returning 
what we have 11044 1726853263.27250: results queue empty 11044 1726853263.27251: checking for any_errors_fatal 11044 1726853263.27258: done checking for any_errors_fatal 11044 1726853263.27259: checking for max_fail_percentage 11044 1726853263.27260: done checking for max_fail_percentage 11044 1726853263.27261: checking to see if all hosts have failed and the running result is not ok 11044 1726853263.27262: done checking to see if all hosts have failed 11044 1726853263.27262: getting the remaining hosts for this loop 11044 1726853263.27263: done getting the remaining hosts for this loop 11044 1726853263.27267: getting the next task for host managed_node1 11044 1726853263.27275: done getting next task for host managed_node1 11044 1726853263.27285: ^ task is: TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 11044 1726853263.27289: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 11044 1726853263.27306: getting variables 11044 1726853263.27308: in VariableManager get_vars() 11044 1726853263.27348: Calling all_inventory to load vars for managed_node1 11044 1726853263.27351: Calling groups_inventory to load vars for managed_node1 11044 1726853263.27353: Calling all_plugins_inventory to load vars for managed_node1 11044 1726853263.27362: Calling all_plugins_play to load vars for managed_node1 11044 1726853263.27365: Calling groups_plugins_inventory to load vars for managed_node1 11044 1726853263.27368: Calling groups_plugins_play to load vars for managed_node1 11044 1726853263.27384: done sending task result for task 02083763-bbaf-c5a6-f857-000000000085 11044 1726853263.27386: WORKER PROCESS EXITING 11044 1726853263.28328: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11044 1726853263.29203: done with get_vars() 11044 1726853263.29218: done getting variables 11044 1726853263.29264: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85 Friday 20 September 2024 13:27:43 -0400 (0:00:00.097) 0:00:27.668 ****** 11044 1726853263.29290: entering _queue_task() for managed_node1/package 11044 1726853263.29532: worker is 1 (out of 1 available) 11044 1726853263.29550: exiting _queue_task() for managed_node1/package 11044 1726853263.29561: done queuing things up, now waiting for results queue to drain 11044 1726853263.29563: waiting for pending results... 
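The "Install packages" skip recorded above is driven by a `when` guard comparing the role's package list against the gathered package facts. A minimal sketch of what such a task could look like, reconstructed from the logged task name and `false_condition` (this is an illustration, not the role's verbatim source):

```yaml
# Hypothetical reconstruction from the log's false_condition:
#   "not network_packages is subset(ansible_facts.packages.keys())"
# Not the fedora.linux_system_roles.network role's verbatim task.
- name: Install packages
  package:
    name: "{{ network_packages }}"
    state: present
  when:
    # Only run when at least one requested package is missing from the
    # package facts; otherwise the task is skipped, as seen in the log.
    - not network_packages is subset(ansible_facts.packages.keys())
```

Because every package in `network_packages` was already present in `ansible_facts.packages`, the condition evaluated to False and the task was skipped without contacting the package manager.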
11044 1726853263.29746: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 11044 1726853263.29852: in run() - task 02083763-bbaf-c5a6-f857-000000000086 11044 1726853263.29863: variable 'ansible_search_path' from source: unknown 11044 1726853263.29866: variable 'ansible_search_path' from source: unknown 11044 1726853263.29896: calling self._execute() 11044 1726853263.29976: variable 'ansible_host' from source: host vars for 'managed_node1' 11044 1726853263.29982: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11044 1726853263.29990: variable 'omit' from source: magic vars 11044 1726853263.30476: variable 'ansible_distribution_major_version' from source: facts 11044 1726853263.30479: Evaluated conditional (ansible_distribution_major_version != '6'): True 11044 1726853263.30482: variable 'network_state' from source: role '' defaults 11044 1726853263.30485: Evaluated conditional (network_state != {}): False 11044 1726853263.30488: when evaluation is False, skipping this task 11044 1726853263.30492: _execute() done 11044 1726853263.30500: dumping result to json 11044 1726853263.30510: done dumping result, returning 11044 1726853263.30524: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable [02083763-bbaf-c5a6-f857-000000000086] 11044 1726853263.30534: sending task result for task 02083763-bbaf-c5a6-f857-000000000086 skipping: [managed_node1] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 11044 1726853263.30693: no more pending results, returning what we have 11044 1726853263.30715: results queue empty 11044 1726853263.30716: checking for any_errors_fatal 11044 1726853263.30723: done checking for any_errors_fatal 11044 1726853263.30723: checking for max_fail_percentage 11044 
1726853263.30725: done checking for max_fail_percentage 11044 1726853263.30726: checking to see if all hosts have failed and the running result is not ok 11044 1726853263.30727: done checking to see if all hosts have failed 11044 1726853263.30727: getting the remaining hosts for this loop 11044 1726853263.30729: done getting the remaining hosts for this loop 11044 1726853263.30732: getting the next task for host managed_node1 11044 1726853263.30739: done getting next task for host managed_node1 11044 1726853263.30742: ^ task is: TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 11044 1726853263.30746: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 11044 1726853263.30765: done sending task result for task 02083763-bbaf-c5a6-f857-000000000086 11044 1726853263.30768: WORKER PROCESS EXITING 11044 1726853263.30819: getting variables 11044 1726853263.30919: in VariableManager get_vars() 11044 1726853263.30954: Calling all_inventory to load vars for managed_node1 11044 1726853263.30956: Calling groups_inventory to load vars for managed_node1 11044 1726853263.30959: Calling all_plugins_inventory to load vars for managed_node1 11044 1726853263.30967: Calling all_plugins_play to load vars for managed_node1 11044 1726853263.30970: Calling groups_plugins_inventory to load vars for managed_node1 11044 1726853263.30975: Calling groups_plugins_play to load vars for managed_node1 11044 1726853263.31886: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11044 1726853263.32739: done with get_vars() 11044 1726853263.32755: done getting variables 11044 1726853263.32797: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96 Friday 20 September 2024 13:27:43 -0400 (0:00:00.035) 0:00:27.703 ****** 11044 1726853263.32824: entering _queue_task() for managed_node1/package 11044 1726853263.33048: worker is 1 (out of 1 available) 11044 1726853263.33061: exiting _queue_task() for managed_node1/package 11044 1726853263.33075: done queuing things up, now waiting for results queue to drain 11044 1726853263.33077: waiting for pending results... 
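Both nmstate-related install tasks above were skipped on the same guard, `network_state != {}`. A hedged sketch of the shape of such a task, inferred from the task name and the logged `false_condition` (the exact package list is an assumption for illustration):

```yaml
# Hypothetical sketch -- inferred from the task name and the logged
# false_condition "network_state != {}"; not the role's verbatim source.
- name: Install NetworkManager and nmstate when using network_state variable
  package:
    name:
      - NetworkManager  # assumed package names for illustration
      - nmstate
    state: present
  when: network_state != {}  # network_state defaults to {}, so this skips
```

Since `network_state` came from the role defaults as an empty dict, the condition was False and the task was skipped.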
11044 1726853263.33269: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 11044 1726853263.33685: in run() - task 02083763-bbaf-c5a6-f857-000000000087 11044 1726853263.33689: variable 'ansible_search_path' from source: unknown 11044 1726853263.33692: variable 'ansible_search_path' from source: unknown 11044 1726853263.33694: calling self._execute() 11044 1726853263.33696: variable 'ansible_host' from source: host vars for 'managed_node1' 11044 1726853263.33699: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11044 1726853263.33702: variable 'omit' from source: magic vars 11044 1726853263.34185: variable 'ansible_distribution_major_version' from source: facts 11044 1726853263.34196: Evaluated conditional (ansible_distribution_major_version != '6'): True 11044 1726853263.34322: variable 'network_state' from source: role '' defaults 11044 1726853263.34331: Evaluated conditional (network_state != {}): False 11044 1726853263.34335: when evaluation is False, skipping this task 11044 1726853263.34338: _execute() done 11044 1726853263.34340: dumping result to json 11044 1726853263.34343: done dumping result, returning 11044 1726853263.34360: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable [02083763-bbaf-c5a6-f857-000000000087] 11044 1726853263.34363: sending task result for task 02083763-bbaf-c5a6-f857-000000000087 11044 1726853263.34462: done sending task result for task 02083763-bbaf-c5a6-f857-000000000087 11044 1726853263.34576: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 11044 1726853263.34612: no more pending results, returning what we have 11044 1726853263.34615: results queue empty 11044 1726853263.34616: checking for 
any_errors_fatal 11044 1726853263.34619: done checking for any_errors_fatal 11044 1726853263.34620: checking for max_fail_percentage 11044 1726853263.34622: done checking for max_fail_percentage 11044 1726853263.34623: checking to see if all hosts have failed and the running result is not ok 11044 1726853263.34624: done checking to see if all hosts have failed 11044 1726853263.34624: getting the remaining hosts for this loop 11044 1726853263.34626: done getting the remaining hosts for this loop 11044 1726853263.34628: getting the next task for host managed_node1 11044 1726853263.34634: done getting next task for host managed_node1 11044 1726853263.34637: ^ task is: TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 11044 1726853263.34641: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 11044 1726853263.34658: getting variables 11044 1726853263.34659: in VariableManager get_vars() 11044 1726853263.34696: Calling all_inventory to load vars for managed_node1 11044 1726853263.34698: Calling groups_inventory to load vars for managed_node1 11044 1726853263.34700: Calling all_plugins_inventory to load vars for managed_node1 11044 1726853263.34709: Calling all_plugins_play to load vars for managed_node1 11044 1726853263.34712: Calling groups_plugins_inventory to load vars for managed_node1 11044 1726853263.34715: Calling groups_plugins_play to load vars for managed_node1 11044 1726853263.36130: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11044 1726853263.37679: done with get_vars() 11044 1726853263.37707: done getting variables 11044 1726853263.37775: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109 Friday 20 September 2024 13:27:43 -0400 (0:00:00.049) 0:00:27.753 ****** 11044 1726853263.37813: entering _queue_task() for managed_node1/service 11044 1726853263.38402: worker is 1 (out of 1 available) 11044 1726853263.38412: exiting _queue_task() for managed_node1/service 11044 1726853263.38422: done queuing things up, now waiting for results queue to drain 11044 1726853263.38423: waiting for pending results... 
11044 1726853263.38522: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 11044 1726853263.38689: in run() - task 02083763-bbaf-c5a6-f857-000000000088 11044 1726853263.38709: variable 'ansible_search_path' from source: unknown 11044 1726853263.38719: variable 'ansible_search_path' from source: unknown 11044 1726853263.38764: calling self._execute() 11044 1726853263.38875: variable 'ansible_host' from source: host vars for 'managed_node1' 11044 1726853263.38888: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11044 1726853263.38903: variable 'omit' from source: magic vars 11044 1726853263.39284: variable 'ansible_distribution_major_version' from source: facts 11044 1726853263.39305: Evaluated conditional (ansible_distribution_major_version != '6'): True 11044 1726853263.39430: variable '__network_wireless_connections_defined' from source: role '' defaults 11044 1726853263.39638: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 11044 1726853263.41903: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 11044 1726853263.41956: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 11044 1726853263.42002: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 11044 1726853263.42046: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 11044 1726853263.42120: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 11044 1726853263.42164: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, 
class_only=False) 11044 1726853263.42200: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11044 1726853263.42233: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11044 1726853263.42280: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11044 1726853263.42298: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11044 1726853263.42352: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11044 1726853263.42382: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11044 1726853263.42446: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11044 1726853263.42457: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11044 1726853263.42477: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11044 1726853263.42519: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11044 1726853263.42549: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11044 1726853263.42676: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11044 1726853263.42679: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11044 1726853263.42681: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11044 1726853263.42814: variable 'network_connections' from source: task vars 11044 1726853263.42830: variable 'port2_profile' from source: play vars 11044 1726853263.42905: variable 'port2_profile' from source: play vars 11044 1726853263.42920: variable 'port1_profile' from source: play vars 11044 1726853263.42985: variable 'port1_profile' from source: play vars 11044 1726853263.42998: variable 'controller_profile' from source: play vars 11044 1726853263.43066: variable 'controller_profile' from source: play vars 11044 1726853263.43146: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 11044 1726853263.43332: Loading TestModule 'core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 11044 1726853263.43381: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 11044 1726853263.43414: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 11044 1726853263.43453: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 11044 1726853263.43502: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 11044 1726853263.43559: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 11044 1726853263.43563: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 11044 1726853263.43593: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 11044 1726853263.43656: variable '__network_team_connections_defined' from source: role '' defaults 11044 1726853263.43866: variable 'network_connections' from source: task vars 11044 1726853263.44072: variable 'port2_profile' from source: play vars 11044 1726853263.44078: variable 'port2_profile' from source: play vars 11044 1726853263.44080: variable 'port1_profile' from source: play vars 11044 1726853263.44082: variable 'port1_profile' from source: play vars 11044 1726853263.44086: variable 'controller_profile' from source: play vars 11044 1726853263.44093: variable 'controller_profile' from source: play vars 11044 1726853263.44124: Evaluated conditional 
(__network_wireless_connections_defined or __network_team_connections_defined): False 11044 1726853263.44142: when evaluation is False, skipping this task 11044 1726853263.44154: _execute() done 11044 1726853263.44161: dumping result to json 11044 1726853263.44168: done dumping result, returning 11044 1726853263.44183: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces [02083763-bbaf-c5a6-f857-000000000088] 11044 1726853263.44192: sending task result for task 02083763-bbaf-c5a6-f857-000000000088 skipping: [managed_node1] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 11044 1726853263.44359: no more pending results, returning what we have 11044 1726853263.44362: results queue empty 11044 1726853263.44364: checking for any_errors_fatal 11044 1726853263.44372: done checking for any_errors_fatal 11044 1726853263.44373: checking for max_fail_percentage 11044 1726853263.44375: done checking for max_fail_percentage 11044 1726853263.44375: checking to see if all hosts have failed and the running result is not ok 11044 1726853263.44376: done checking to see if all hosts have failed 11044 1726853263.44377: getting the remaining hosts for this loop 11044 1726853263.44378: done getting the remaining hosts for this loop 11044 1726853263.44382: getting the next task for host managed_node1 11044 1726853263.44390: done getting next task for host managed_node1 11044 1726853263.44394: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 11044 1726853263.44399: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? 
(HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 11044 1726853263.44419: getting variables 11044 1726853263.44421: in VariableManager get_vars() 11044 1726853263.44574: Calling all_inventory to load vars for managed_node1 11044 1726853263.44578: Calling groups_inventory to load vars for managed_node1 11044 1726853263.44581: Calling all_plugins_inventory to load vars for managed_node1 11044 1726853263.44587: done sending task result for task 02083763-bbaf-c5a6-f857-000000000088 11044 1726853263.44590: WORKER PROCESS EXITING 11044 1726853263.44601: Calling all_plugins_play to load vars for managed_node1 11044 1726853263.44605: Calling groups_plugins_inventory to load vars for managed_node1 11044 1726853263.44609: Calling groups_plugins_play to load vars for managed_node1 11044 1726853263.46183: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11044 1726853263.47851: done with get_vars() 11044 1726853263.47874: done getting variables 11044 1726853263.47934: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start NetworkManager] ***** task path: 
/tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122 Friday 20 September 2024 13:27:43 -0400 (0:00:00.101) 0:00:27.855 ****** 11044 1726853263.47969: entering _queue_task() for managed_node1/service 11044 1726853263.48302: worker is 1 (out of 1 available) 11044 1726853263.48314: exiting _queue_task() for managed_node1/service 11044 1726853263.48326: done queuing things up, now waiting for results queue to drain 11044 1726853263.48328: waiting for pending results... 11044 1726853263.48612: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 11044 1726853263.48756: in run() - task 02083763-bbaf-c5a6-f857-000000000089 11044 1726853263.48777: variable 'ansible_search_path' from source: unknown 11044 1726853263.48785: variable 'ansible_search_path' from source: unknown 11044 1726853263.48825: calling self._execute() 11044 1726853263.48932: variable 'ansible_host' from source: host vars for 'managed_node1' 11044 1726853263.48943: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11044 1726853263.48961: variable 'omit' from source: magic vars 11044 1726853263.49332: variable 'ansible_distribution_major_version' from source: facts 11044 1726853263.49355: Evaluated conditional (ansible_distribution_major_version != '6'): True 11044 1726853263.49525: variable 'network_provider' from source: set_fact 11044 1726853263.49559: variable 'network_state' from source: role '' defaults 11044 1726853263.49562: Evaluated conditional (network_provider == "nm" or network_state != {}): True 11044 1726853263.49565: variable 'omit' from source: magic vars 11044 1726853263.49627: variable 'omit' from source: magic vars 11044 1726853263.49667: variable 'network_service_name' from source: role '' defaults 11044 1726853263.49777: variable 'network_service_name' from source: role '' defaults 11044 1726853263.49843: variable '__network_provider_setup' from 
source: role '' defaults 11044 1726853263.49858: variable '__network_service_name_default_nm' from source: role '' defaults 11044 1726853263.49929: variable '__network_service_name_default_nm' from source: role '' defaults 11044 1726853263.49946: variable '__network_packages_default_nm' from source: role '' defaults 11044 1726853263.50015: variable '__network_packages_default_nm' from source: role '' defaults 11044 1726853263.50476: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 11044 1726853263.52459: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 11044 1726853263.52539: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 11044 1726853263.52588: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 11044 1726853263.52628: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 11044 1726853263.52664: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 11044 1726853263.52748: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11044 1726853263.52790: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11044 1726853263.52821: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11044 1726853263.52873: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11044 1726853263.52894: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11044 1726853263.52942: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11044 1726853263.53076: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11044 1726853263.53081: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11044 1726853263.53089: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11044 1726853263.53092: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11044 1726853263.53313: variable '__network_packages_default_gobject_packages' from source: role '' defaults 11044 1726853263.53439: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11044 1726853263.53470: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 
(found_in_cache=True, class_only=False) 11044 1726853263.53501: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11044 1726853263.53549: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11044 1726853263.53569: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11044 1726853263.53673: variable 'ansible_python' from source: facts 11044 1726853263.53700: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 11044 1726853263.53789: variable '__network_wpa_supplicant_required' from source: role '' defaults 11044 1726853263.53876: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 11044 1726853263.54003: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11044 1726853263.54032: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11044 1726853263.54066: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11044 1726853263.54110: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11044 
1726853263.54128: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11044 1726853263.54278: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11044 1726853263.54290: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11044 1726853263.54292: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11044 1726853263.54295: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11044 1726853263.54410: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11044 1726853263.54453: variable 'network_connections' from source: task vars 11044 1726853263.54466: variable 'port2_profile' from source: play vars 11044 1726853263.54551: variable 'port2_profile' from source: play vars 11044 1726853263.54569: variable 'port1_profile' from source: play vars 11044 1726853263.54654: variable 'port1_profile' from source: play vars 11044 1726853263.54673: variable 'controller_profile' from source: play vars 11044 1726853263.54756: variable 'controller_profile' from source: play vars 11044 1726853263.54873: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 11044 
1726853263.55099: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 11044 1726853263.55157: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 11044 1726853263.55211: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 11044 1726853263.55256: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 11044 1726853263.55322: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 11044 1726853263.55355: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 11044 1726853263.55395: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 11044 1726853263.55431: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 11044 1726853263.55487: variable '__network_wireless_connections_defined' from source: role '' defaults 11044 1726853263.55824: variable 'network_connections' from source: task vars 11044 1726853263.55828: variable 'port2_profile' from source: play vars 11044 1726853263.55836: variable 'port2_profile' from source: play vars 11044 1726853263.55856: variable 'port1_profile' from source: play vars 11044 1726853263.55937: variable 'port1_profile' from source: play vars 11044 1726853263.55960: variable 'controller_profile' from source: play vars 11044 1726853263.56042: variable 'controller_profile' from source: play vars 11044 
1726853263.56091: variable '__network_packages_default_wireless' from source: role '' defaults 11044 1726853263.56174: variable '__network_wireless_connections_defined' from source: role '' defaults 11044 1726853263.56439: variable 'network_connections' from source: task vars 11044 1726853263.56452: variable 'port2_profile' from source: play vars 11044 1726853263.56522: variable 'port2_profile' from source: play vars 11044 1726853263.56588: variable 'port1_profile' from source: play vars 11044 1726853263.56607: variable 'port1_profile' from source: play vars 11044 1726853263.56619: variable 'controller_profile' from source: play vars 11044 1726853263.56687: variable 'controller_profile' from source: play vars 11044 1726853263.56717: variable '__network_packages_default_team' from source: role '' defaults 11044 1726853263.56797: variable '__network_team_connections_defined' from source: role '' defaults 11044 1726853263.57089: variable 'network_connections' from source: task vars 11044 1726853263.57099: variable 'port2_profile' from source: play vars 11044 1726853263.57175: variable 'port2_profile' from source: play vars 11044 1726853263.57188: variable 'port1_profile' from source: play vars 11044 1726853263.57353: variable 'port1_profile' from source: play vars 11044 1726853263.57356: variable 'controller_profile' from source: play vars 11044 1726853263.57359: variable 'controller_profile' from source: play vars 11044 1726853263.57407: variable '__network_service_name_default_initscripts' from source: role '' defaults 11044 1726853263.57481: variable '__network_service_name_default_initscripts' from source: role '' defaults 11044 1726853263.57494: variable '__network_packages_default_initscripts' from source: role '' defaults 11044 1726853263.57554: variable '__network_packages_default_initscripts' from source: role '' defaults 11044 1726853263.57800: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 11044 1726853263.58366: 
variable 'network_connections' from source: task vars 11044 1726853263.58380: variable 'port2_profile' from source: play vars 11044 1726853263.58447: variable 'port2_profile' from source: play vars 11044 1726853263.58460: variable 'port1_profile' from source: play vars 11044 1726853263.58522: variable 'port1_profile' from source: play vars 11044 1726853263.58534: variable 'controller_profile' from source: play vars 11044 1726853263.58604: variable 'controller_profile' from source: play vars 11044 1726853263.58616: variable 'ansible_distribution' from source: facts 11044 1726853263.58625: variable '__network_rh_distros' from source: role '' defaults 11044 1726853263.58634: variable 'ansible_distribution_major_version' from source: facts 11044 1726853263.58658: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 11044 1726853263.58876: variable 'ansible_distribution' from source: facts 11044 1726853263.58880: variable '__network_rh_distros' from source: role '' defaults 11044 1726853263.58882: variable 'ansible_distribution_major_version' from source: facts 11044 1726853263.58885: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 11044 1726853263.59048: variable 'ansible_distribution' from source: facts 11044 1726853263.59058: variable '__network_rh_distros' from source: role '' defaults 11044 1726853263.59067: variable 'ansible_distribution_major_version' from source: facts 11044 1726853263.59113: variable 'network_provider' from source: set_fact 11044 1726853263.59140: variable 'omit' from source: magic vars 11044 1726853263.59178: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11044 1726853263.59212: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11044 1726853263.59237: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11044 
1726853263.59260: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11044 1726853263.59277: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11044 1726853263.59307: variable 'inventory_hostname' from source: host vars for 'managed_node1' 11044 1726853263.59315: variable 'ansible_host' from source: host vars for 'managed_node1' 11044 1726853263.59326: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11044 1726853263.59434: Set connection var ansible_timeout to 10 11044 1726853263.59451: Set connection var ansible_shell_executable to /bin/sh 11044 1726853263.59458: Set connection var ansible_shell_type to sh 11044 1726853263.59467: Set connection var ansible_module_compression to ZIP_DEFLATED 11044 1726853263.59477: Set connection var ansible_connection to ssh 11044 1726853263.59542: Set connection var ansible_pipelining to False 11044 1726853263.59548: variable 'ansible_shell_executable' from source: unknown 11044 1726853263.59550: variable 'ansible_connection' from source: unknown 11044 1726853263.59552: variable 'ansible_module_compression' from source: unknown 11044 1726853263.59554: variable 'ansible_shell_type' from source: unknown 11044 1726853263.59556: variable 'ansible_shell_executable' from source: unknown 11044 1726853263.59558: variable 'ansible_host' from source: host vars for 'managed_node1' 11044 1726853263.59559: variable 'ansible_pipelining' from source: unknown 11044 1726853263.59561: variable 'ansible_timeout' from source: unknown 11044 1726853263.59563: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11044 1726853263.59662: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11044 1726853263.59679: variable 'omit' from source: magic vars 11044 1726853263.59689: starting attempt loop 11044 1726853263.59694: running the handler 11044 1726853263.59778: variable 'ansible_facts' from source: unknown 11044 1726853263.60554: _low_level_execute_command(): starting 11044 1726853263.60624: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11044 1726853263.61295: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11044 1726853263.61316: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11044 1726853263.61392: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11044 1726853263.61416: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 11044 1726853263.61436: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11044 1726853263.61463: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11044 1726853263.61543: stderr chunk 
(state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11044 1726853263.63218: stdout chunk (state=3): >>>/root <<< 11044 1726853263.63377: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11044 1726853263.63381: stdout chunk (state=3): >>><<< 11044 1726853263.63383: stderr chunk (state=3): >>><<< 11044 1726853263.63497: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11044 1726853263.63503: _low_level_execute_command(): starting 11044 1726853263.63506: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853263.6340501-12368-273414097149506 `" && echo ansible-tmp-1726853263.6340501-12368-273414097149506="` echo 
/root/.ansible/tmp/ansible-tmp-1726853263.6340501-12368-273414097149506 `" ) && sleep 0' 11044 1726853263.64438: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11044 1726853263.64442: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11044 1726853263.64448: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11044 1726853263.64451: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11044 1726853263.64453: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 <<< 11044 1726853263.64455: stderr chunk (state=3): >>>debug2: match not found <<< 11044 1726853263.64458: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11044 1726853263.64548: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 11044 1726853263.64552: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK <<< 11044 1726853263.64580: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11044 1726853263.64664: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11044 1726853263.66587: stdout chunk (state=3): >>>ansible-tmp-1726853263.6340501-12368-273414097149506=/root/.ansible/tmp/ansible-tmp-1726853263.6340501-12368-273414097149506 
<<< 11044 1726853263.66690: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11044 1726853263.66731: stderr chunk (state=3): >>><<< 11044 1726853263.66743: stdout chunk (state=3): >>><<< 11044 1726853263.66860: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853263.6340501-12368-273414097149506=/root/.ansible/tmp/ansible-tmp-1726853263.6340501-12368-273414097149506 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11044 1726853263.66896: variable 'ansible_module_compression' from source: unknown 11044 1726853263.67076: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-1104467doc9gy/ansiballz_cache/ansible.modules.systemd-ZIP_DEFLATED 11044 1726853263.67123: variable 'ansible_facts' from source: unknown 11044 1726853263.67561: transferring module to remote 
/root/.ansible/tmp/ansible-tmp-1726853263.6340501-12368-273414097149506/AnsiballZ_systemd.py 11044 1726853263.68020: Sending initial data 11044 1726853263.68024: Sent initial data (156 bytes) 11044 1726853263.69038: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11044 1726853263.69055: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11044 1726853263.69069: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11044 1726853263.69090: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11044 1726853263.69365: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK <<< 11044 1726853263.69383: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11044 1726853263.69449: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11044 1726853263.71124: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: 
Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 11044 1726853263.71153: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 11044 1726853263.71190: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-1104467doc9gy/tmpbzweng3t /root/.ansible/tmp/ansible-tmp-1726853263.6340501-12368-273414097149506/AnsiballZ_systemd.py <<< 11044 1726853263.71194: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853263.6340501-12368-273414097149506/AnsiballZ_systemd.py" <<< 11044 1726853263.71237: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-1104467doc9gy/tmpbzweng3t" to remote "/root/.ansible/tmp/ansible-tmp-1726853263.6340501-12368-273414097149506/AnsiballZ_systemd.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853263.6340501-12368-273414097149506/AnsiballZ_systemd.py" <<< 11044 1726853263.73802: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11044 1726853263.73963: stderr chunk (state=3): >>><<< 11044 1726853263.73967: stdout chunk (state=3): >>><<< 11044 1726853263.73970: done transferring module to remote 11044 1726853263.73974: _low_level_execute_command(): starting 11044 1726853263.73977: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853263.6340501-12368-273414097149506/ 
/root/.ansible/tmp/ansible-tmp-1726853263.6340501-12368-273414097149506/AnsiballZ_systemd.py && sleep 0' 11044 1726853263.75159: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11044 1726853263.75304: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 11044 1726853263.75348: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11044 1726853263.75388: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11044 1726853263.77232: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11044 1726853263.77235: stdout chunk (state=3): >>><<< 11044 1726853263.77238: stderr chunk (state=3): >>><<< 11044 1726853263.77252: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final 
all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11044 1726853263.77262: _low_level_execute_command(): starting 11044 1726853263.77274: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853263.6340501-12368-273414097149506/AnsiballZ_systemd.py && sleep 0' 11044 1726853263.78327: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11044 1726853263.78331: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found <<< 11044 1726853263.78333: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11044 1726853263.78335: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration <<< 11044 1726853263.78341: stderr chunk (state=3): 
>>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11044 1726853263.78401: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 11044 1726853263.78411: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11044 1726853263.78441: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11044 1726853263.78509: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11044 1726853264.07223: stdout chunk (state=3): >>> {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "702", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Fri 2024-09-20 13:20:47 EDT", "ExecMainStartTimestampMonotonic": "13747067", 
"ExecMainExitTimestampMonotonic": "0", "ExecMainHandoffTimestamp": "Fri 2024-09-20 13:20:47 EDT", "ExecMainHandoffTimestampMonotonic": "13825256", "ExecMainPID": "702", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "2977", "MemoryCurrent": "10543104", "MemoryPeak": "14561280", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3323629568", "EffectiveMemoryMax": "3702878208", "EffectiveMemoryHigh": "3702878208", "CPUUsageNSec": "509216000", "TasksCurrent": "4", "EffectiveTasksMax": "22362", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": 
"infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryZSwapWriteback": "yes", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22362", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "CoredumpRe<<< 11044 1726853264.07263: stdout chunk (state=3): >>>ceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13976", "LimitNPROCSoft": "13976", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13976", "LimitSIGPENDINGSoft": "13976", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", 
"IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": 
"root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "sysinit.target system.slice dbus.socket", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "NetworkManager-wait-online.service shutdown.target multi-user.target network.target cloud-init.service", "After": "cloud-init-local.service systemd-journald.socket sysinit.target dbus.socket dbus-broker.service basic.target system.slice network-pre.target", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 13:27:21 EDT", "StateChangeTimestampMonotonic": "407641563", "InactiveExitTimestamp": "Fri 2024-09-20 13:20:47 EDT", 
"InactiveExitTimestampMonotonic": "13748890", "ActiveEnterTimestamp": "Fri 2024-09-20 13:20:47 EDT", "ActiveEnterTimestampMonotonic": "14166608", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 13:20:47 EDT", "ConditionTimestampMonotonic": "13745559", "AssertTimestamp": "Fri 2024-09-20 13:20:47 EDT", "AssertTimestampMonotonic": "13745562", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "5f58decfa480494eac8aa3993b4c7ec8", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} <<< 11044 1726853264.09113: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.45.153 closed. 
<<< 11044 1726853264.09123: stdout chunk (state=3): >>><<< 11044 1726853264.09139: stderr chunk (state=3): >>><<< 11044 1726853264.09425: _low_level_execute_command() done: rc=0, stdout= {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "702", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Fri 2024-09-20 13:20:47 EDT", "ExecMainStartTimestampMonotonic": "13747067", "ExecMainExitTimestampMonotonic": "0", "ExecMainHandoffTimestamp": "Fri 2024-09-20 13:20:47 EDT", "ExecMainHandoffTimestampMonotonic": "13825256", "ExecMainPID": "702", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager 
/org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "2977", "MemoryCurrent": "10543104", "MemoryPeak": "14561280", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3323629568", "EffectiveMemoryMax": "3702878208", "EffectiveMemoryHigh": "3702878208", "CPUUsageNSec": "509216000", "TasksCurrent": "4", "EffectiveTasksMax": "22362", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryZSwapWriteback": "yes", "MemoryLimit": "infinity", "DevicePolicy": 
"auto", "TasksAccounting": "yes", "TasksMax": "22362", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "CoredumpReceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13976", "LimitNPROCSoft": "13976", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13976", "LimitSIGPENDINGSoft": "13976", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot 
cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", 
"Names": "NetworkManager.service", "Requires": "sysinit.target system.slice dbus.socket", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "NetworkManager-wait-online.service shutdown.target multi-user.target network.target cloud-init.service", "After": "cloud-init-local.service systemd-journald.socket sysinit.target dbus.socket dbus-broker.service basic.target system.slice network-pre.target", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 13:27:21 EDT", "StateChangeTimestampMonotonic": "407641563", "InactiveExitTimestamp": "Fri 2024-09-20 13:20:47 EDT", "InactiveExitTimestampMonotonic": "13748890", "ActiveEnterTimestamp": "Fri 2024-09-20 13:20:47 EDT", "ActiveEnterTimestampMonotonic": "14166608", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 13:20:47 EDT", "ConditionTimestampMonotonic": "13745559", "AssertTimestamp": "Fri 2024-09-20 13:20:47 EDT", "AssertTimestampMonotonic": "13745562", 
"Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "5f58decfa480494eac8aa3993b4c7ec8", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.45.153 closed. 
11044 1726853264.09628: done with _execute_module (ansible.legacy.systemd, {'name': 'NetworkManager', 'state': 'started', 'enabled': True, '_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.systemd', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853263.6340501-12368-273414097149506/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11044 1726853264.09658: _low_level_execute_command(): starting 11044 1726853264.09668: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853263.6340501-12368-273414097149506/ > /dev/null 2>&1 && sleep 0' 11044 1726853264.10338: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11044 1726853264.10360: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11044 1726853264.10387: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11044 1726853264.10408: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11044 1726853264.10477: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11044 1726853264.10521: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 11044 1726853264.10538: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11044 1726853264.10564: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11044 1726853264.10656: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11044 1726853264.12484: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11044 1726853264.12548: stderr chunk (state=3): >>><<< 11044 1726853264.12559: stdout chunk (state=3): >>><<< 11044 1726853264.12582: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: 
fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11044 1726853264.12598: handler run complete 11044 1726853264.12670: attempt loop complete, returning result 11044 1726853264.12682: _execute() done 11044 1726853264.12689: dumping result to json 11044 1726853264.12747: done dumping result, returning 11044 1726853264.12763: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager [02083763-bbaf-c5a6-f857-000000000089] 11044 1726853264.12817: sending task result for task 02083763-bbaf-c5a6-f857-000000000089 ok: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 11044 1726853264.13505: no more pending results, returning what we have 11044 1726853264.13508: results queue empty 11044 1726853264.13510: checking for any_errors_fatal 11044 1726853264.13518: done checking for any_errors_fatal 11044 1726853264.13519: checking for max_fail_percentage 11044 1726853264.13521: done checking for max_fail_percentage 11044 1726853264.13522: checking to see if all hosts have failed and the running result is not ok 11044 1726853264.13522: done checking to see if all hosts have failed 11044 1726853264.13523: getting the remaining hosts for this loop 11044 1726853264.13525: done getting the remaining hosts for this loop 11044 1726853264.13528: getting the next task for host managed_node1 11044 1726853264.13535: done getting next task for host managed_node1 11044 1726853264.13539: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 11044 1726853264.13546: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? 
(None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 11044 1726853264.13559: getting variables 11044 1726853264.13561: in VariableManager get_vars() 11044 1726853264.13603: Calling all_inventory to load vars for managed_node1 11044 1726853264.13606: Calling groups_inventory to load vars for managed_node1 11044 1726853264.13608: Calling all_plugins_inventory to load vars for managed_node1 11044 1726853264.13779: Calling all_plugins_play to load vars for managed_node1 11044 1726853264.13784: Calling groups_plugins_inventory to load vars for managed_node1 11044 1726853264.13787: Calling groups_plugins_play to load vars for managed_node1 11044 1726853264.14735: done sending task result for task 02083763-bbaf-c5a6-f857-000000000089 11044 1726853264.14738: WORKER PROCESS EXITING 11044 1726853264.15603: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11044 1726853264.17164: done with get_vars() 11044 1726853264.17192: done getting variables 11044 1726853264.17253: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start 
wpa_supplicant] ***** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133 Friday 20 September 2024 13:27:44 -0400 (0:00:00.693) 0:00:28.548 ****** 11044 1726853264.17296: entering _queue_task() for managed_node1/service 11044 1726853264.17720: worker is 1 (out of 1 available) 11044 1726853264.17733: exiting _queue_task() for managed_node1/service 11044 1726853264.17748: done queuing things up, now waiting for results queue to drain 11044 1726853264.17749: waiting for pending results... 11044 1726853264.18399: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 11044 1726853264.18626: in run() - task 02083763-bbaf-c5a6-f857-00000000008a 11044 1726853264.18648: variable 'ansible_search_path' from source: unknown 11044 1726853264.18659: variable 'ansible_search_path' from source: unknown 11044 1726853264.18700: calling self._execute() 11044 1726853264.18933: variable 'ansible_host' from source: host vars for 'managed_node1' 11044 1726853264.18948: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11044 1726853264.19077: variable 'omit' from source: magic vars 11044 1726853264.19792: variable 'ansible_distribution_major_version' from source: facts 11044 1726853264.19812: Evaluated conditional (ansible_distribution_major_version != '6'): True 11044 1726853264.20176: variable 'network_provider' from source: set_fact 11044 1726853264.20179: Evaluated conditional (network_provider == "nm"): True 11044 1726853264.20285: variable '__network_wpa_supplicant_required' from source: role '' defaults 11044 1726853264.20563: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 11044 1726853264.20977: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 11044 1726853264.26523: Loading FilterModule 'core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 11044 1726853264.26708: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 11044 1726853264.26758: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 11044 1726853264.26819: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 11044 1726853264.26909: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 11044 1726853264.27139: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11044 1726853264.27185: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11044 1726853264.27330: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11044 1726853264.27367: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11044 1726853264.27467: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11044 1726853264.27594: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11044 1726853264.27703: Loading 
FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11044 1726853264.27707: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11044 1726853264.27747: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11044 1726853264.27831: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11044 1726853264.27933: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11044 1726853264.28030: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11044 1726853264.28055: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11044 1726853264.28136: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11044 1726853264.28361: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 
11044 1726853264.28502: variable 'network_connections' from source: task vars 11044 1726853264.28592: variable 'port2_profile' from source: play vars 11044 1726853264.28717: variable 'port2_profile' from source: play vars 11044 1726853264.28763: variable 'port1_profile' from source: play vars 11044 1726853264.28894: variable 'port1_profile' from source: play vars 11044 1726853264.28909: variable 'controller_profile' from source: play vars 11044 1726853264.29120: variable 'controller_profile' from source: play vars 11044 1726853264.29295: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 11044 1726853264.29716: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 11044 1726853264.29819: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 11044 1726853264.29948: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 11044 1726853264.30027: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 11044 1726853264.30132: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 11044 1726853264.30215: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 11044 1726853264.30341: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 11044 1726853264.30362: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, 
class_only=False) 11044 1726853264.30532: variable '__network_wireless_connections_defined' from source: role '' defaults 11044 1726853264.31116: variable 'network_connections' from source: task vars 11044 1726853264.31126: variable 'port2_profile' from source: play vars 11044 1726853264.31415: variable 'port2_profile' from source: play vars 11044 1726853264.31418: variable 'port1_profile' from source: play vars 11044 1726853264.31421: variable 'port1_profile' from source: play vars 11044 1726853264.31432: variable 'controller_profile' from source: play vars 11044 1726853264.31596: variable 'controller_profile' from source: play vars 11044 1726853264.31643: Evaluated conditional (__network_wpa_supplicant_required): False 11044 1726853264.31685: when evaluation is False, skipping this task 11044 1726853264.31695: _execute() done 11044 1726853264.31702: dumping result to json 11044 1726853264.31708: done dumping result, returning 11044 1726853264.31718: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant [02083763-bbaf-c5a6-f857-00000000008a] 11044 1726853264.31725: sending task result for task 02083763-bbaf-c5a6-f857-00000000008a 11044 1726853264.31955: done sending task result for task 02083763-bbaf-c5a6-f857-00000000008a 11044 1726853264.31958: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "__network_wpa_supplicant_required", "skip_reason": "Conditional result was False" } 11044 1726853264.32009: no more pending results, returning what we have 11044 1726853264.32013: results queue empty 11044 1726853264.32014: checking for any_errors_fatal 11044 1726853264.32036: done checking for any_errors_fatal 11044 1726853264.32037: checking for max_fail_percentage 11044 1726853264.32039: done checking for max_fail_percentage 11044 1726853264.32040: checking to see if all hosts have failed and the running result is not ok 11044 1726853264.32040: done checking to 
see if all hosts have failed 11044 1726853264.32041: getting the remaining hosts for this loop 11044 1726853264.32043: done getting the remaining hosts for this loop 11044 1726853264.32049: getting the next task for host managed_node1 11044 1726853264.32061: done getting next task for host managed_node1 11044 1726853264.32066: ^ task is: TASK: fedora.linux_system_roles.network : Enable network service 11044 1726853264.32070: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 11044 1726853264.32090: getting variables 11044 1726853264.32092: in VariableManager get_vars() 11044 1726853264.32134: Calling all_inventory to load vars for managed_node1 11044 1726853264.32137: Calling groups_inventory to load vars for managed_node1 11044 1726853264.32140: Calling all_plugins_inventory to load vars for managed_node1 11044 1726853264.32154: Calling all_plugins_play to load vars for managed_node1 11044 1726853264.32158: Calling groups_plugins_inventory to load vars for managed_node1 11044 1726853264.32161: Calling groups_plugins_play to load vars for managed_node1 11044 1726853264.34626: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11044 1726853264.36441: done with get_vars() 11044 1726853264.36473: done getting variables 11044 1726853264.36531: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable network service] ************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142 Friday 20 September 2024 13:27:44 -0400 (0:00:00.192) 0:00:28.741 ****** 11044 1726853264.36570: entering _queue_task() for managed_node1/service 11044 1726853264.36912: worker is 1 (out of 1 available) 11044 1726853264.36926: exiting _queue_task() for managed_node1/service 11044 1726853264.36938: done queuing things up, now waiting for results queue to drain 11044 1726853264.36940: waiting for pending results... 
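The skip recorded above ("Evaluated conditional (__network_wpa_supplicant_required): False ... skipping this task") corresponds to a guarded service task. A minimal sketch, assuming the unit name is `wpa_supplicant` (the task name and condition come from the trace; the unit name and exact module spelling are assumptions):

```yaml
# Hypothetical sketch of the skipped task; condition taken from the
# "false_condition" field in the skip result above.
- name: Enable and start wpa_supplicant
  ansible.builtin.service:
    name: wpa_supplicant    # assumed unit name
    state: started
    enabled: true
  when: __network_wpa_supplicant_required
```

Because `network_provider` is `nm` but no IEEE 802.1X or wireless connections are defined, `__network_wpa_supplicant_required` evaluates false and the task is skipped without contacting the host.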
11044 1726853264.37238: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable network service 11044 1726853264.37791: in run() - task 02083763-bbaf-c5a6-f857-00000000008b 11044 1726853264.37900: variable 'ansible_search_path' from source: unknown 11044 1726853264.37903: variable 'ansible_search_path' from source: unknown 11044 1726853264.37906: calling self._execute() 11044 1726853264.37966: variable 'ansible_host' from source: host vars for 'managed_node1' 11044 1726853264.37980: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11044 1726853264.38080: variable 'omit' from source: magic vars 11044 1726853264.38616: variable 'ansible_distribution_major_version' from source: facts 11044 1726853264.38634: Evaluated conditional (ansible_distribution_major_version != '6'): True 11044 1726853264.38891: variable 'network_provider' from source: set_fact 11044 1726853264.38902: Evaluated conditional (network_provider == "initscripts"): False 11044 1726853264.39081: when evaluation is False, skipping this task 11044 1726853264.39084: _execute() done 11044 1726853264.39087: dumping result to json 11044 1726853264.39090: done dumping result, returning 11044 1726853264.39093: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable network service [02083763-bbaf-c5a6-f857-00000000008b] 11044 1726853264.39095: sending task result for task 02083763-bbaf-c5a6-f857-00000000008b 11044 1726853264.39164: done sending task result for task 02083763-bbaf-c5a6-f857-00000000008b 11044 1726853264.39167: WORKER PROCESS EXITING skipping: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 11044 1726853264.39215: no more pending results, returning what we have 11044 1726853264.39219: results queue empty 11044 1726853264.39221: checking for any_errors_fatal 11044 1726853264.39232: done checking for 
any_errors_fatal 11044 1726853264.39233: checking for max_fail_percentage 11044 1726853264.39235: done checking for max_fail_percentage 11044 1726853264.39236: checking to see if all hosts have failed and the running result is not ok 11044 1726853264.39237: done checking to see if all hosts have failed 11044 1726853264.39238: getting the remaining hosts for this loop 11044 1726853264.39239: done getting the remaining hosts for this loop 11044 1726853264.39243: getting the next task for host managed_node1 11044 1726853264.39252: done getting next task for host managed_node1 11044 1726853264.39257: ^ task is: TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 11044 1726853264.39262: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 11044 1726853264.39285: getting variables 11044 1726853264.39287: in VariableManager get_vars() 11044 1726853264.39330: Calling all_inventory to load vars for managed_node1 11044 1726853264.39333: Calling groups_inventory to load vars for managed_node1 11044 1726853264.39335: Calling all_plugins_inventory to load vars for managed_node1 11044 1726853264.39350: Calling all_plugins_play to load vars for managed_node1 11044 1726853264.39354: Calling groups_plugins_inventory to load vars for managed_node1 11044 1726853264.39358: Calling groups_plugins_play to load vars for managed_node1 11044 1726853264.41464: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11044 1726853264.43650: done with get_vars() 11044 1726853264.43692: done getting variables 11044 1726853264.43774: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ensure initscripts network file dependency is present] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150 Friday 20 September 2024 13:27:44 -0400 (0:00:00.074) 0:00:28.815 ****** 11044 1726853264.43978: entering _queue_task() for managed_node1/copy 11044 1726853264.44449: worker is 1 (out of 1 available) 11044 1726853264.44463: exiting _queue_task() for managed_node1/copy 11044 1726853264.44479: done queuing things up, now waiting for results queue to drain 11044 1726853264.44481: waiting for pending results... 
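Both the "Enable network service" skip above and the "Ensure initscripts network file dependency is present" task queued next hinge on the same guard, `network_provider == "initscripts"`, which the trace evaluates to False since the provider is `nm`. A hedged sketch of that pattern (the condition is verbatim from the trace; the module choices and the `network` unit name are assumptions, not the role's actual source):

```yaml
# Hypothetical sketch of the initscripts-only task pair skipped in the trace.
- name: Enable network service
  ansible.builtin.service:
    name: network            # assumed initscripts service name
    enabled: true
  when: network_provider == "initscripts"

- name: Ensure initscripts network file dependency is present
  ansible.builtin.copy:      # the trace loads the 'copy' action plugin here
    content: ""              # placeholder; real content not shown in the log
    dest: /etc/sysconfig/network   # assumed path
  when: network_provider == "initscripts"
```

On an NM-provider run like this one, both tasks short-circuit locally ("when evaluation is False, skipping this task") and produce no SSH traffic.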
11044 1726853264.44892: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 11044 1726853264.44941: in run() - task 02083763-bbaf-c5a6-f857-00000000008c 11044 1726853264.44963: variable 'ansible_search_path' from source: unknown 11044 1726853264.44974: variable 'ansible_search_path' from source: unknown 11044 1726853264.45017: calling self._execute() 11044 1726853264.45132: variable 'ansible_host' from source: host vars for 'managed_node1' 11044 1726853264.45144: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11044 1726853264.45157: variable 'omit' from source: magic vars 11044 1726853264.45655: variable 'ansible_distribution_major_version' from source: facts 11044 1726853264.45712: Evaluated conditional (ansible_distribution_major_version != '6'): True 11044 1726853264.45857: variable 'network_provider' from source: set_fact 11044 1726853264.45868: Evaluated conditional (network_provider == "initscripts"): False 11044 1726853264.45882: when evaluation is False, skipping this task 11044 1726853264.45890: _execute() done 11044 1726853264.45897: dumping result to json 11044 1726853264.45976: done dumping result, returning 11044 1726853264.45982: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present [02083763-bbaf-c5a6-f857-00000000008c] 11044 1726853264.45986: sending task result for task 02083763-bbaf-c5a6-f857-00000000008c 11044 1726853264.46068: done sending task result for task 02083763-bbaf-c5a6-f857-00000000008c 11044 1726853264.46073: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "network_provider == \"initscripts\"", "skip_reason": "Conditional result was False" } 11044 1726853264.46127: no more pending results, returning what we have 11044 1726853264.46131: results queue empty 11044 1726853264.46133: checking for 
any_errors_fatal 11044 1726853264.46139: done checking for any_errors_fatal 11044 1726853264.46140: checking for max_fail_percentage 11044 1726853264.46142: done checking for max_fail_percentage 11044 1726853264.46143: checking to see if all hosts have failed and the running result is not ok 11044 1726853264.46144: done checking to see if all hosts have failed 11044 1726853264.46145: getting the remaining hosts for this loop 11044 1726853264.46146: done getting the remaining hosts for this loop 11044 1726853264.46150: getting the next task for host managed_node1 11044 1726853264.46158: done getting next task for host managed_node1 11044 1726853264.46162: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking connection profiles 11044 1726853264.46167: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 11044 1726853264.46191: getting variables 11044 1726853264.46193: in VariableManager get_vars() 11044 1726853264.46240: Calling all_inventory to load vars for managed_node1 11044 1726853264.46243: Calling groups_inventory to load vars for managed_node1 11044 1726853264.46245: Calling all_plugins_inventory to load vars for managed_node1 11044 1726853264.46258: Calling all_plugins_play to load vars for managed_node1 11044 1726853264.46261: Calling groups_plugins_inventory to load vars for managed_node1 11044 1726853264.46263: Calling groups_plugins_play to load vars for managed_node1 11044 1726853264.48031: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11044 1726853264.49751: done with get_vars() 11044 1726853264.49779: done getting variables TASK [fedora.linux_system_roles.network : Configure networking connection profiles] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159 Friday 20 September 2024 13:27:44 -0400 (0:00:00.059) 0:00:28.874 ****** 11044 1726853264.49882: entering _queue_task() for managed_node1/fedora.linux_system_roles.network_connections 11044 1726853264.50250: worker is 1 (out of 1 available) 11044 1726853264.50262: exiting _queue_task() for managed_node1/fedora.linux_system_roles.network_connections 11044 1726853264.50278: done queuing things up, now waiting for results queue to drain 11044 1726853264.50279: waiting for pending results... 
11044 1726853264.50813: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Configure networking connection profiles 11044 1726853264.50818: in run() - task 02083763-bbaf-c5a6-f857-00000000008d 11044 1726853264.50820: variable 'ansible_search_path' from source: unknown 11044 1726853264.50823: variable 'ansible_search_path' from source: unknown 11044 1726853264.50826: calling self._execute() 11044 1726853264.50864: variable 'ansible_host' from source: host vars for 'managed_node1' 11044 1726853264.50872: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11044 1726853264.50888: variable 'omit' from source: magic vars 11044 1726853264.51269: variable 'ansible_distribution_major_version' from source: facts 11044 1726853264.51288: Evaluated conditional (ansible_distribution_major_version != '6'): True 11044 1726853264.51299: variable 'omit' from source: magic vars 11044 1726853264.51377: variable 'omit' from source: magic vars 11044 1726853264.51540: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 11044 1726853264.53709: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 11044 1726853264.53828: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 11044 1726853264.53832: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 11044 1726853264.53867: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 11044 1726853264.53904: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 11044 1726853264.53989: variable 'network_provider' from source: set_fact 11044 1726853264.54125: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11044 1726853264.54183: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11044 1726853264.54263: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11044 1726853264.54267: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11044 1726853264.54287: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11044 1726853264.54364: variable 'omit' from source: magic vars 11044 1726853264.54486: variable 'omit' from source: magic vars 11044 1726853264.54595: variable 'network_connections' from source: task vars 11044 1726853264.54610: variable 'port2_profile' from source: play vars 11044 1726853264.54681: variable 'port2_profile' from source: play vars 11044 1726853264.54700: variable 'port1_profile' from source: play vars 11044 1726853264.54803: variable 'port1_profile' from source: play vars 11044 1726853264.54807: variable 'controller_profile' from source: play vars 11044 1726853264.54837: variable 'controller_profile' from source: play vars 11044 1726853264.55011: variable 'omit' from source: magic vars 11044 1726853264.55031: variable '__lsr_ansible_managed' from source: task vars 11044 1726853264.55101: variable '__lsr_ansible_managed' from source: task vars 11044 1726853264.55288: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup 11044 
1726853264.55518: Loaded config def from plugin (lookup/template) 11044 1726853264.55565: Loading LookupModule 'template' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/template.py 11044 1726853264.55568: File lookup term: get_ansible_managed.j2 11044 1726853264.55572: variable 'ansible_search_path' from source: unknown 11044 1726853264.55581: evaluation_path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks 11044 1726853264.55598: search_path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/templates/get_ansible_managed.j2 /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/get_ansible_managed.j2 /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/templates/get_ansible_managed.j2 /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/get_ansible_managed.j2 /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 11044 1726853264.55622: variable 'ansible_search_path' from source: unknown 11044 1726853264.61775: variable 'ansible_managed' from source: unknown 11044 1726853264.62078: variable 'omit' from source: magic vars 11044 1726853264.62082: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11044 1726853264.62085: Loading Connection 'ssh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11044 1726853264.62087: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11044 1726853264.62089: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11044 1726853264.62091: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11044 1726853264.62093: variable 'inventory_hostname' from source: host vars for 'managed_node1' 11044 1726853264.62095: variable 'ansible_host' from source: host vars for 'managed_node1' 11044 1726853264.62103: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11044 1726853264.62210: Set connection var ansible_timeout to 10 11044 1726853264.62228: Set connection var ansible_shell_executable to /bin/sh 11044 1726853264.62235: Set connection var ansible_shell_type to sh 11044 1726853264.62247: Set connection var ansible_module_compression to ZIP_DEFLATED 11044 1726853264.62257: Set connection var ansible_connection to ssh 11044 1726853264.62267: Set connection var ansible_pipelining to False 11044 1726853264.62297: variable 'ansible_shell_executable' from source: unknown 11044 1726853264.62305: variable 'ansible_connection' from source: unknown 11044 1726853264.62315: variable 'ansible_module_compression' from source: unknown 11044 1726853264.62322: variable 'ansible_shell_type' from source: unknown 11044 1726853264.62327: variable 'ansible_shell_executable' from source: unknown 11044 1726853264.62333: variable 'ansible_host' from source: host vars for 'managed_node1' 11044 1726853264.62339: variable 'ansible_pipelining' from source: unknown 11044 1726853264.62347: variable 'ansible_timeout' from source: unknown 11044 1726853264.62365: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11044 
1726853264.62533: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 11044 1726853264.62536: variable 'omit' from source: magic vars 11044 1726853264.62538: starting attempt loop 11044 1726853264.62540: running the handler 11044 1726853264.62542: _low_level_execute_command(): starting 11044 1726853264.62552: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11044 1726853264.63249: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11044 1726853264.63264: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11044 1726853264.63306: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11044 1726853264.63317: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 11044 1726853264.63388: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11044 1726853264.63416: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 11044 1726853264.63433: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 
11044 1726853264.63458: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11044 1726853264.63539: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11044 1726853264.65230: stdout chunk (state=3): >>>/root <<< 11044 1726853264.65370: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11044 1726853264.65401: stdout chunk (state=3): >>><<< 11044 1726853264.65404: stderr chunk (state=3): >>><<< 11044 1726853264.65425: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11044 1726853264.65443: _low_level_execute_command(): starting 11044 1726853264.65533: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo 
/root/.ansible/tmp/ansible-tmp-1726853264.6543195-12413-107600295088146 `" && echo ansible-tmp-1726853264.6543195-12413-107600295088146="` echo /root/.ansible/tmp/ansible-tmp-1726853264.6543195-12413-107600295088146 `" ) && sleep 0' 11044 1726853264.66067: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11044 1726853264.66072: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found <<< 11044 1726853264.66075: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration <<< 11044 1726853264.66077: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found <<< 11044 1726853264.66080: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11044 1726853264.66129: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK <<< 11044 1726853264.66154: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11044 1726853264.66234: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11044 1726853264.68152: stdout chunk (state=3): >>>ansible-tmp-1726853264.6543195-12413-107600295088146=/root/.ansible/tmp/ansible-tmp-1726853264.6543195-12413-107600295088146 <<< 11044 
1726853264.68311: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11044 1726853264.68315: stdout chunk (state=3): >>><<< 11044 1726853264.68317: stderr chunk (state=3): >>><<< 11044 1726853264.68334: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853264.6543195-12413-107600295088146=/root/.ansible/tmp/ansible-tmp-1726853264.6543195-12413-107600295088146 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11044 1726853264.68393: variable 'ansible_module_compression' from source: unknown 11044 1726853264.68476: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-1104467doc9gy/ansiballz_cache/ansible_collections.fedora.linux_system_roles.plugins.modules.network_connections-ZIP_DEFLATED 11044 1726853264.68509: variable 'ansible_facts' from source: unknown 11044 1726853264.68674: transferring module to remote 
/root/.ansible/tmp/ansible-tmp-1726853264.6543195-12413-107600295088146/AnsiballZ_network_connections.py 11044 1726853264.68936: Sending initial data 11044 1726853264.68940: Sent initial data (168 bytes) 11044 1726853264.69614: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11044 1726853264.69678: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 11044 1726853264.69696: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11044 1726853264.69728: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11044 1726853264.69848: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11044 1726853264.71414: stderr chunk (state=3): >>>debug2: Remote version: 3 <<< 11044 1726853264.71418: stderr chunk (state=3): >>>debug2: Server supports extension "posix-rename@openssh.com" revision 1 <<< 11044 1726853264.71438: stderr chunk (state=3): >>>debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 
2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 11044 1726853264.71510: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 11044 1726853264.71597: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-1104467doc9gy/tmpuwsdh_7w /root/.ansible/tmp/ansible-tmp-1726853264.6543195-12413-107600295088146/AnsiballZ_network_connections.py <<< 11044 1726853264.71606: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853264.6543195-12413-107600295088146/AnsiballZ_network_connections.py" <<< 11044 1726853264.71635: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-1104467doc9gy/tmpuwsdh_7w" to remote "/root/.ansible/tmp/ansible-tmp-1726853264.6543195-12413-107600295088146/AnsiballZ_network_connections.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853264.6543195-12413-107600295088146/AnsiballZ_network_connections.py" <<< 11044 1726853264.73483: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11044 1726853264.73487: stdout chunk (state=3): >>><<< 11044 1726853264.73489: stderr chunk (state=3): >>><<< 11044 1726853264.73491: done transferring module to remote 11044 1726853264.73493: _low_level_execute_command(): starting 11044 1726853264.73495: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853264.6543195-12413-107600295088146/ 
/root/.ansible/tmp/ansible-tmp-1726853264.6543195-12413-107600295088146/AnsiballZ_network_connections.py && sleep 0' 11044 1726853264.74609: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11044 1726853264.74792: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found <<< 11044 1726853264.74796: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 11044 1726853264.74896: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11044 1726853264.74969: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11044 1726853264.76746: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11044 1726853264.76868: stderr chunk (state=3): >>><<< 11044 1726853264.76873: stdout chunk (state=3): >>><<< 11044 1726853264.76890: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11044 1726853264.76898: _low_level_execute_command(): starting 11044 1726853264.76908: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853264.6543195-12413-107600295088146/AnsiballZ_network_connections.py && sleep 0' 11044 1726853264.77965: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11044 1726853264.78074: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK <<< 11044 1726853264.78263: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11044 1726853264.78352: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11044 1726853265.29463: stdout chunk (state=3): >>>Traceback (most recent call last): File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_n556xzn9/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/connection.py", line 113, in _nm_profile_volatile_update2_call_back File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_n556xzn9/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/client.py", line 102, in fail ansible_collections.fedora.linux_system_roles.plugins.module_utils.network_lsr.nm.error.LsrNetworkNmError: Connection volatilize aborted on bond0.1/89e597e4-eed9-47da-b8eb-c0faec291bf7: error=unknown <<< 11044 1726853265.30794: stdout chunk (state=3): >>>Traceback (most recent call last): File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_n556xzn9/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/connection.py", line 113, in _nm_profile_volatile_update2_call_back File 
"/tmp/ansible_fedora.linux_system_roles.network_connections_payload_n556xzn9/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/client.py", line 102, in fail ansible_collections.fedora.linux_system_roles.plugins.module_utils.network_lsr.nm.error.LsrNetworkNmError: Connection volatilize aborted on bond0.0/8b9437bb-6e81-41e5-9306-532bc96d8ae0: error=unknown <<< 11044 1726853265.32351: stdout chunk (state=3): >>>Traceback (most recent call last): File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_n556xzn9/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/connection.py", line 113, in _nm_profile_volatile_update2_call_back <<< 11044 1726853265.32356: stdout chunk (state=3): >>> File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_n556xzn9/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/client.py", line 102, in fail<<< 11044 1726853265.32480: stdout chunk (state=3): >>> ansible_collections.fedora.linux_system_roles.plugins.module_utils.network_lsr.nm.error.LsrNetworkNmError: Connection volatilize aborted on bond0/d9b8d035-3bc8-441e-9301-200f331b189f: error=unknown <<< 11044 1726853265.32612: stdout chunk (state=3): >>> {"changed": true, "warnings": [], "stderr": "\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "bond0.1", "persistent_state": "absent", "state": "down"}, {"name": "bond0.0", "persistent_state": "absent", "state": "down"}, {"name": "bond0", "persistent_state": "absent", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": 
[{"name": "bond0.1", "persistent_state": "absent", "state": "down"}, {"name": "bond0.0", "persistent_state": "absent", "state": "down"}, {"name": "bond0", "persistent_state": "absent", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} <<< 11044 1726853265.34567: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.45.153 closed. <<< 11044 1726853265.34706: stderr chunk (state=3): >>><<< 11044 1726853265.34710: stdout chunk (state=3): >>><<< 11044 1726853265.34894: _low_level_execute_command() done: rc=0, stdout=Traceback (most recent call last): File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_n556xzn9/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/connection.py", line 113, in _nm_profile_volatile_update2_call_back File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_n556xzn9/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/client.py", line 102, in fail ansible_collections.fedora.linux_system_roles.plugins.module_utils.network_lsr.nm.error.LsrNetworkNmError: Connection volatilize aborted on bond0.1/89e597e4-eed9-47da-b8eb-c0faec291bf7: error=unknown Traceback (most recent call last): File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_n556xzn9/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/connection.py", line 113, in _nm_profile_volatile_update2_call_back File 
"/tmp/ansible_fedora.linux_system_roles.network_connections_payload_n556xzn9/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/client.py", line 102, in fail ansible_collections.fedora.linux_system_roles.plugins.module_utils.network_lsr.nm.error.LsrNetworkNmError: Connection volatilize aborted on bond0.0/8b9437bb-6e81-41e5-9306-532bc96d8ae0: error=unknown Traceback (most recent call last): File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_n556xzn9/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/connection.py", line 113, in _nm_profile_volatile_update2_call_back File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_n556xzn9/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/client.py", line 102, in fail ansible_collections.fedora.linux_system_roles.plugins.module_utils.network_lsr.nm.error.LsrNetworkNmError: Connection volatilize aborted on bond0/d9b8d035-3bc8-441e-9301-200f331b189f: error=unknown {"changed": true, "warnings": [], "stderr": "\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "bond0.1", "persistent_state": "absent", "state": "down"}, {"name": "bond0.0", "persistent_state": "absent", "state": "down"}, {"name": "bond0", "persistent_state": "absent", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "bond0.1", "persistent_state": "absent", "state": "down"}, {"name": "bond0.0", "persistent_state": "absent", "state": "down"}, {"name": "bond0", "persistent_state": "absent", "state": "down"}], "__header": "#\n# Ansible 
managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.45.153 closed. 
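Editor's note: the stdout recovered above interleaves `LsrNetworkNmError` tracebacks with the module's final JSON result on one stream, yet the run still ends with `rc=0` and a parsed `{"changed": true, ...}` result. A minimal sketch of how such a mixed stream can be separated, assuming a scan for the first parseable JSON object (the helper name is hypothetical; Ansible's real result parsing in `module_utils` is more involved):

```python
import json

def extract_module_result(stdout: str) -> dict:
    # Scan mixed stdout (tracebacks + JSON) for the first substring
    # that parses as a JSON object. Hypothetical helper, not the
    # actual Ansible parser.
    start = stdout.find("{")
    while start != -1:
        try:
            obj, _ = json.JSONDecoder().raw_decode(stdout[start:])
            if isinstance(obj, dict):
                return obj
        except json.JSONDecodeError:
            pass
        start = stdout.find("{", start + 1)
    raise ValueError("no JSON result found in module stdout")

# Shape mirrors the log: traceback lines first, JSON result last.
mixed = (
    "Traceback (most recent call last):\n"
    "LsrNetworkNmError: Connection volatilize aborted on bond0: error=unknown\n"
    '{"changed": true, "warnings": [], "stderr": "\\n"}\n'
)
result = extract_module_result(mixed)
```

Because the JSON payload is the only brace-delimited object in the stream, the tracebacks preceding it do not prevent recovery of the result.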
11044 1726853265.34947: done with _execute_module (fedora.linux_system_roles.network_connections, {'provider': 'nm', 'connections': [{'name': 'bond0.1', 'persistent_state': 'absent', 'state': 'down'}, {'name': 'bond0.0', 'persistent_state': 'absent', 'state': 'down'}, {'name': 'bond0', 'persistent_state': 'absent', 'state': 'down'}], '__header': '#\n# Ansible managed\n#\n# system_role:network\n', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'fedora.linux_system_roles.network_connections', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853264.6543195-12413-107600295088146/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11044 1726853265.34973: _low_level_execute_command(): starting 11044 1726853265.34976: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853264.6543195-12413-107600295088146/ > /dev/null 2>&1 && sleep 0' 11044 1726853265.36282: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11044 1726853265.36292: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11044 1726853265.36302: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11044 1726853265.36565: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK <<< 11044 1726853265.36624: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11044 1726853265.36653: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11044 1726853265.38638: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11044 1726853265.38642: stdout chunk (state=3): >>><<< 11044 1726853265.38648: stderr chunk (state=3): >>><<< 11044 1726853265.38786: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11044 1726853265.38790: handler run complete 11044 1726853265.38822: attempt loop complete, returning result 11044 1726853265.38825: _execute() done 11044 1726853265.38828: dumping result to json 11044 1726853265.38836: done dumping result, returning 11044 1726853265.38846: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Configure networking connection profiles [02083763-bbaf-c5a6-f857-00000000008d] 11044 1726853265.38848: sending task result for task 02083763-bbaf-c5a6-f857-00000000008d changed: [managed_node1] => { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "name": "bond0.1", "persistent_state": "absent", "state": "down" }, { "name": "bond0.0", "persistent_state": "absent", "state": "down" }, { "name": "bond0", "persistent_state": "absent", "state": "down" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true } STDERR: 11044 1726853265.39410: no more pending results, returning what we have 11044 1726853265.39414: results queue empty 11044 1726853265.39415: checking for any_errors_fatal 11044 1726853265.39422: done checking for any_errors_fatal 11044 1726853265.39423: checking for max_fail_percentage 11044 1726853265.39424: done checking for max_fail_percentage 11044 1726853265.39425: checking to see if all hosts have failed and the running result is not ok 11044 1726853265.39426: done checking to see if all hosts have failed 11044 1726853265.39427: getting the remaining hosts for this loop 11044 1726853265.39428: done getting the remaining hosts for this loop 11044 1726853265.39431: getting 
the next task for host managed_node1 11044 1726853265.39437: done getting next task for host managed_node1 11044 1726853265.39441: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking state 11044 1726853265.39446: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 11044 1726853265.39457: getting variables 11044 1726853265.39458: in VariableManager get_vars() 11044 1726853265.39698: Calling all_inventory to load vars for managed_node1 11044 1726853265.39701: Calling groups_inventory to load vars for managed_node1 11044 1726853265.39704: Calling all_plugins_inventory to load vars for managed_node1 11044 1726853265.39714: Calling all_plugins_play to load vars for managed_node1 11044 1726853265.39717: Calling groups_plugins_inventory to load vars for managed_node1 11044 1726853265.39720: Calling groups_plugins_play to load vars for managed_node1 11044 1726853265.40385: done sending task result for task 02083763-bbaf-c5a6-f857-00000000008d 11044 1726853265.40393: WORKER PROCESS EXITING 11044 1726853265.42299: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11044 1726853265.44293: done with get_vars() 11044 1726853265.44318: done getting variables TASK [fedora.linux_system_roles.network : Configure networking state] ********** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171 Friday 20 September 2024 13:27:45 -0400 (0:00:00.945) 0:00:29.819 ****** 11044 1726853265.44422: entering _queue_task() for managed_node1/fedora.linux_system_roles.network_state 11044 1726853265.44916: worker is 1 (out of 1 available) 11044 1726853265.44931: exiting _queue_task() for managed_node1/fedora.linux_system_roles.network_state 11044 1726853265.44948: done queuing things up, now waiting for results queue to drain 11044 1726853265.44950: waiting for pending results... 
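Editor's note: the "Configure networking state" task queued here is skipped a few lines further on with `Evaluated conditional (network_state != {}): False` — `network_state` comes from the role defaults as an empty dict. A minimal sketch of that gate (the function name is hypothetical; the real check is a Jinja2 `when:` expression in the role):

```python
# Sketch of the `when: network_state != {}` gate seen in the log.
# Hypothetical helper; the role evaluates this as a Jinja2 conditional.
def should_configure_network_state(network_state: dict) -> bool:
    return network_state != {}

role_default_network_state = {}  # "variable 'network_state' from source: role '' defaults"
skipped = not should_configure_network_state(role_default_network_state)
```

With no user-supplied `network_state`, the default empty dict makes the conditional False and the task is skipped with `skip_reason: "Conditional result was False"`, exactly as the result below records.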
11044 1726853265.45256: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Configure networking state 11044 1726853265.45451: in run() - task 02083763-bbaf-c5a6-f857-00000000008e 11044 1726853265.45479: variable 'ansible_search_path' from source: unknown 11044 1726853265.45489: variable 'ansible_search_path' from source: unknown 11044 1726853265.45532: calling self._execute() 11044 1726853265.45706: variable 'ansible_host' from source: host vars for 'managed_node1' 11044 1726853265.45718: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11044 1726853265.45731: variable 'omit' from source: magic vars 11044 1726853265.46165: variable 'ansible_distribution_major_version' from source: facts 11044 1726853265.46183: Evaluated conditional (ansible_distribution_major_version != '6'): True 11044 1726853265.46315: variable 'network_state' from source: role '' defaults 11044 1726853265.46338: Evaluated conditional (network_state != {}): False 11044 1726853265.46347: when evaluation is False, skipping this task 11044 1726853265.46354: _execute() done 11044 1726853265.46360: dumping result to json 11044 1726853265.46367: done dumping result, returning 11044 1726853265.46380: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Configure networking state [02083763-bbaf-c5a6-f857-00000000008e] 11044 1726853265.46397: sending task result for task 02083763-bbaf-c5a6-f857-00000000008e skipping: [managed_node1] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 11044 1726853265.46659: no more pending results, returning what we have 11044 1726853265.46663: results queue empty 11044 1726853265.46665: checking for any_errors_fatal 11044 1726853265.46680: done checking for any_errors_fatal 11044 1726853265.46681: checking for max_fail_percentage 11044 1726853265.46683: done checking for max_fail_percentage 11044 1726853265.46684: 
checking to see if all hosts have failed and the running result is not ok 11044 1726853265.46685: done checking to see if all hosts have failed 11044 1726853265.46686: getting the remaining hosts for this loop 11044 1726853265.46687: done getting the remaining hosts for this loop 11044 1726853265.46692: getting the next task for host managed_node1 11044 1726853265.46700: done getting next task for host managed_node1 11044 1726853265.46703: ^ task is: TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 11044 1726853265.46707: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 11044 1726853265.46729: getting variables 11044 1726853265.46731: in VariableManager get_vars() 11044 1726853265.46898: Calling all_inventory to load vars for managed_node1 11044 1726853265.46901: Calling groups_inventory to load vars for managed_node1 11044 1726853265.46904: Calling all_plugins_inventory to load vars for managed_node1 11044 1726853265.46981: Calling all_plugins_play to load vars for managed_node1 11044 1726853265.46985: Calling groups_plugins_inventory to load vars for managed_node1 11044 1726853265.46988: Calling groups_plugins_play to load vars for managed_node1 11044 1726853265.47727: done sending task result for task 02083763-bbaf-c5a6-f857-00000000008e 11044 1726853265.47731: WORKER PROCESS EXITING 11044 1726853265.48545: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11044 1726853265.49400: done with get_vars() 11044 1726853265.49417: done getting variables 11044 1726853265.49463: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show stderr messages for the network_connections] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177 Friday 20 September 2024 13:27:45 -0400 (0:00:00.050) 0:00:29.870 ****** 11044 1726853265.49500: entering _queue_task() for managed_node1/debug 11044 1726853265.49877: worker is 1 (out of 1 available) 11044 1726853265.49889: exiting _queue_task() for managed_node1/debug 11044 1726853265.50027: done queuing things up, now waiting for results queue to drain 11044 1726853265.50029: waiting for pending results... 
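Editor's note: the debug task that follows prints `"__network_connections_result.stderr_lines": [""]` even though the module returned `"stderr": "\n"`. That pairing is consistent with Ansible deriving `*_lines` keys by splitting the text on newlines (an assumption about the implementation; `splitlines()` reproduces the observed value):

```python
# A stderr of a single newline splits into one empty line, which is
# why the debug output shows stderr_lines == [""] for stderr == "\n".
stderr = "\n"           # value returned by network_connections
stderr_lines = stderr.splitlines()
```

So the `[""]` in the debug output indicates an effectively empty stderr, not a lost message.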
11044 1726853265.50297: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 11044 1726853265.50713: in run() - task 02083763-bbaf-c5a6-f857-00000000008f 11044 1726853265.50716: variable 'ansible_search_path' from source: unknown 11044 1726853265.50718: variable 'ansible_search_path' from source: unknown 11044 1726853265.50759: calling self._execute() 11044 1726853265.50914: variable 'ansible_host' from source: host vars for 'managed_node1' 11044 1726853265.50926: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11044 1726853265.50981: variable 'omit' from source: magic vars 11044 1726853265.51451: variable 'ansible_distribution_major_version' from source: facts 11044 1726853265.51466: Evaluated conditional (ansible_distribution_major_version != '6'): True 11044 1726853265.51487: variable 'omit' from source: magic vars 11044 1726853265.51570: variable 'omit' from source: magic vars 11044 1726853265.51614: variable 'omit' from source: magic vars 11044 1726853265.51654: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11044 1726853265.51687: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11044 1726853265.51704: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11044 1726853265.51814: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11044 1726853265.51821: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11044 1726853265.51824: variable 'inventory_hostname' from source: host vars for 'managed_node1' 11044 1726853265.51826: variable 'ansible_host' from source: host vars for 'managed_node1' 11044 1726853265.51828: variable 'ansible_ssh_extra_args' from 
source: host vars for 'managed_node1' 11044 1726853265.51880: Set connection var ansible_timeout to 10 11044 1726853265.51898: Set connection var ansible_shell_executable to /bin/sh 11044 1726853265.51901: Set connection var ansible_shell_type to sh 11044 1726853265.51904: Set connection var ansible_module_compression to ZIP_DEFLATED 11044 1726853265.51906: Set connection var ansible_connection to ssh 11044 1726853265.51918: Set connection var ansible_pipelining to False 11044 1726853265.51935: variable 'ansible_shell_executable' from source: unknown 11044 1726853265.51943: variable 'ansible_connection' from source: unknown 11044 1726853265.51947: variable 'ansible_module_compression' from source: unknown 11044 1726853265.51949: variable 'ansible_shell_type' from source: unknown 11044 1726853265.51951: variable 'ansible_shell_executable' from source: unknown 11044 1726853265.51953: variable 'ansible_host' from source: host vars for 'managed_node1' 11044 1726853265.51955: variable 'ansible_pipelining' from source: unknown 11044 1726853265.51958: variable 'ansible_timeout' from source: unknown 11044 1726853265.51960: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11044 1726853265.52177: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11044 1726853265.52182: variable 'omit' from source: magic vars 11044 1726853265.52185: starting attempt loop 11044 1726853265.52187: running the handler 11044 1726853265.52289: variable '__network_connections_result' from source: set_fact 11044 1726853265.52292: handler run complete 11044 1726853265.52304: attempt loop complete, returning result 11044 1726853265.52307: _execute() done 11044 1726853265.52310: dumping result to json 11044 1726853265.52313: 
done dumping result, returning 11044 1726853265.52322: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections [02083763-bbaf-c5a6-f857-00000000008f] 11044 1726853265.52325: sending task result for task 02083763-bbaf-c5a6-f857-00000000008f 11044 1726853265.52452: done sending task result for task 02083763-bbaf-c5a6-f857-00000000008f 11044 1726853265.52455: WORKER PROCESS EXITING ok: [managed_node1] => { "__network_connections_result.stderr_lines": [ "" ] } 11044 1726853265.52559: no more pending results, returning what we have 11044 1726853265.52562: results queue empty 11044 1726853265.52565: checking for any_errors_fatal 11044 1726853265.52570: done checking for any_errors_fatal 11044 1726853265.52571: checking for max_fail_percentage 11044 1726853265.52573: done checking for max_fail_percentage 11044 1726853265.52574: checking to see if all hosts have failed and the running result is not ok 11044 1726853265.52575: done checking to see if all hosts have failed 11044 1726853265.52576: getting the remaining hosts for this loop 11044 1726853265.52577: done getting the remaining hosts for this loop 11044 1726853265.52580: getting the next task for host managed_node1 11044 1726853265.52586: done getting next task for host managed_node1 11044 1726853265.52589: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 11044 1726853265.52592: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=23, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 11044 1726853265.52602: getting variables 11044 1726853265.52604: in VariableManager get_vars() 11044 1726853265.52638: Calling all_inventory to load vars for managed_node1 11044 1726853265.52641: Calling groups_inventory to load vars for managed_node1 11044 1726853265.52643: Calling all_plugins_inventory to load vars for managed_node1 11044 1726853265.52651: Calling all_plugins_play to load vars for managed_node1 11044 1726853265.52653: Calling groups_plugins_inventory to load vars for managed_node1 11044 1726853265.52656: Calling groups_plugins_play to load vars for managed_node1 11044 1726853265.54638: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11044 1726853265.55518: done with get_vars() 11044 1726853265.55538: done getting variables 11044 1726853265.55588: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_connections] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181 Friday 20 September 2024 13:27:45 -0400 (0:00:00.061) 0:00:29.932 ****** 11044 1726853265.55627: entering _queue_task() for managed_node1/debug 11044 1726853265.56028: worker is 1 (out of 1 available) 11044 
1726853265.56041: exiting _queue_task() for managed_node1/debug 11044 1726853265.56054: done queuing things up, now waiting for results queue to drain 11044 1726853265.56055: waiting for pending results... 11044 1726853265.56499: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 11044 1726853265.56560: in run() - task 02083763-bbaf-c5a6-f857-000000000090 11044 1726853265.56564: variable 'ansible_search_path' from source: unknown 11044 1726853265.56567: variable 'ansible_search_path' from source: unknown 11044 1726853265.56614: calling self._execute() 11044 1726853265.56776: variable 'ansible_host' from source: host vars for 'managed_node1' 11044 1726853265.56779: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11044 1726853265.56795: variable 'omit' from source: magic vars 11044 1726853265.57217: variable 'ansible_distribution_major_version' from source: facts 11044 1726853265.57319: Evaluated conditional (ansible_distribution_major_version != '6'): True 11044 1726853265.57323: variable 'omit' from source: magic vars 11044 1726853265.57326: variable 'omit' from source: magic vars 11044 1726853265.57346: variable 'omit' from source: magic vars 11044 1726853265.57392: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11044 1726853265.57434: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11044 1726853265.57454: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11044 1726853265.57475: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11044 1726853265.57487: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11044 1726853265.57523: variable 
'inventory_hostname' from source: host vars for 'managed_node1' 11044 1726853265.57527: variable 'ansible_host' from source: host vars for 'managed_node1' 11044 1726853265.57529: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11044 1726853265.57623: Set connection var ansible_timeout to 10 11044 1726853265.57650: Set connection var ansible_shell_executable to /bin/sh 11044 1726853265.57654: Set connection var ansible_shell_type to sh 11044 1726853265.57656: Set connection var ansible_module_compression to ZIP_DEFLATED 11044 1726853265.57659: Set connection var ansible_connection to ssh 11044 1726853265.57661: Set connection var ansible_pipelining to False 11044 1726853265.57736: variable 'ansible_shell_executable' from source: unknown 11044 1726853265.57739: variable 'ansible_connection' from source: unknown 11044 1726853265.57741: variable 'ansible_module_compression' from source: unknown 11044 1726853265.57747: variable 'ansible_shell_type' from source: unknown 11044 1726853265.57749: variable 'ansible_shell_executable' from source: unknown 11044 1726853265.57752: variable 'ansible_host' from source: host vars for 'managed_node1' 11044 1726853265.57754: variable 'ansible_pipelining' from source: unknown 11044 1726853265.57756: variable 'ansible_timeout' from source: unknown 11044 1726853265.57758: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11044 1726853265.57998: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11044 1726853265.58002: variable 'omit' from source: magic vars 11044 1726853265.58005: starting attempt loop 11044 1726853265.58007: running the handler 11044 1726853265.58010: variable '__network_connections_result' from source: set_fact 11044 
1726853265.58026: variable '__network_connections_result' from source: set_fact 11044 1726853265.58172: handler run complete 11044 1726853265.58204: attempt loop complete, returning result 11044 1726853265.58221: _execute() done 11044 1726853265.58227: dumping result to json 11044 1726853265.58235: done dumping result, returning 11044 1726853265.58250: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections [02083763-bbaf-c5a6-f857-000000000090] 11044 1726853265.58260: sending task result for task 02083763-bbaf-c5a6-f857-000000000090 ok: [managed_node1] => { "__network_connections_result": { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "name": "bond0.1", "persistent_state": "absent", "state": "down" }, { "name": "bond0.0", "persistent_state": "absent", "state": "down" }, { "name": "bond0", "persistent_state": "absent", "state": "down" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true, "failed": false, "stderr": "\n", "stderr_lines": [ "" ] } } 11044 1726853265.58633: no more pending results, returning what we have 11044 1726853265.58636: results queue empty 11044 1726853265.58637: checking for any_errors_fatal 11044 1726853265.58666: done checking for any_errors_fatal 11044 1726853265.58667: checking for max_fail_percentage 11044 1726853265.58669: done checking for max_fail_percentage 11044 1726853265.58669: checking to see if all hosts have failed and the running result is not ok 11044 1726853265.58672: done checking to see if all hosts have failed 11044 1726853265.58673: getting the remaining hosts for this loop 11044 1726853265.58674: done getting the remaining hosts for this loop 11044 1726853265.58677: getting the next task for host managed_node1 11044 1726853265.58683: done getting next task for host managed_node1 11044 1726853265.58687: ^ task 
is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 11044 1726853265.58690: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 11044 1726853265.58698: done sending task result for task 02083763-bbaf-c5a6-f857-000000000090 11044 1726853265.58700: WORKER PROCESS EXITING 11044 1726853265.58707: getting variables 11044 1726853265.58708: in VariableManager get_vars() 11044 1726853265.58742: Calling all_inventory to load vars for managed_node1 11044 1726853265.58747: Calling groups_inventory to load vars for managed_node1 11044 1726853265.58750: Calling all_plugins_inventory to load vars for managed_node1 11044 1726853265.58764: Calling all_plugins_play to load vars for managed_node1 11044 1726853265.58774: Calling groups_plugins_inventory to load vars for managed_node1 11044 1726853265.58804: Calling groups_plugins_play to load vars for managed_node1 11044 1726853265.59592: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11044 1726853265.60455: done with get_vars() 11044 1726853265.60476: done getting variables 11044 1726853265.60522: Loading ActionModule 'debug' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_state] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186 Friday 20 September 2024 13:27:45 -0400 (0:00:00.049) 0:00:29.981 ****** 11044 1726853265.60548: entering _queue_task() for managed_node1/debug 11044 1726853265.60888: worker is 1 (out of 1 available) 11044 1726853265.60902: exiting _queue_task() for managed_node1/debug 11044 1726853265.60915: done queuing things up, now waiting for results queue to drain 11044 1726853265.60916: waiting for pending results... 11044 1726853265.61186: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 11044 1726853265.61301: in run() - task 02083763-bbaf-c5a6-f857-000000000091 11044 1726853265.61319: variable 'ansible_search_path' from source: unknown 11044 1726853265.61326: variable 'ansible_search_path' from source: unknown 11044 1726853265.61477: calling self._execute() 11044 1726853265.61483: variable 'ansible_host' from source: host vars for 'managed_node1' 11044 1726853265.61486: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11044 1726853265.61491: variable 'omit' from source: magic vars 11044 1726853265.61905: variable 'ansible_distribution_major_version' from source: facts 11044 1726853265.61915: Evaluated conditional (ansible_distribution_major_version != '6'): True 11044 1726853265.62001: variable 'network_state' from source: role '' defaults 11044 1726853265.62010: Evaluated conditional (network_state != {}): False 11044 1726853265.62013: when evaluation is False, skipping this task 11044 1726853265.62016: 
_execute() done 11044 1726853265.62018: dumping result to json 11044 1726853265.62021: done dumping result, returning 11044 1726853265.62030: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state [02083763-bbaf-c5a6-f857-000000000091] 11044 1726853265.62040: sending task result for task 02083763-bbaf-c5a6-f857-000000000091 11044 1726853265.62127: done sending task result for task 02083763-bbaf-c5a6-f857-000000000091 11044 1726853265.62129: WORKER PROCESS EXITING skipping: [managed_node1] => { "false_condition": "network_state != {}" } 11044 1726853265.62190: no more pending results, returning what we have 11044 1726853265.62194: results queue empty 11044 1726853265.62195: checking for any_errors_fatal 11044 1726853265.62205: done checking for any_errors_fatal 11044 1726853265.62205: checking for max_fail_percentage 11044 1726853265.62207: done checking for max_fail_percentage 11044 1726853265.62207: checking to see if all hosts have failed and the running result is not ok 11044 1726853265.62208: done checking to see if all hosts have failed 11044 1726853265.62209: getting the remaining hosts for this loop 11044 1726853265.62210: done getting the remaining hosts for this loop 11044 1726853265.62213: getting the next task for host managed_node1 11044 1726853265.62221: done getting next task for host managed_node1 11044 1726853265.62224: ^ task is: TASK: fedora.linux_system_roles.network : Re-test connectivity 11044 1726853265.62228: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 11044 1726853265.62248: getting variables 11044 1726853265.62250: in VariableManager get_vars() 11044 1726853265.62285: Calling all_inventory to load vars for managed_node1 11044 1726853265.62288: Calling groups_inventory to load vars for managed_node1 11044 1726853265.62290: Calling all_plugins_inventory to load vars for managed_node1 11044 1726853265.62299: Calling all_plugins_play to load vars for managed_node1 11044 1726853265.62301: Calling groups_plugins_inventory to load vars for managed_node1 11044 1726853265.62303: Calling groups_plugins_play to load vars for managed_node1 11044 1726853265.66384: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11044 1726853265.67219: done with get_vars() 11044 1726853265.67236: done getting variables TASK [fedora.linux_system_roles.network : Re-test connectivity] **************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192 Friday 20 September 2024 13:27:45 -0400 (0:00:00.067) 0:00:30.048 ****** 11044 1726853265.67301: entering _queue_task() for managed_node1/ping 11044 1726853265.67569: worker is 1 (out of 1 available) 11044 1726853265.67585: exiting _queue_task() for managed_node1/ping 11044 1726853265.67599: done queuing things up, now waiting for results queue to drain 11044 1726853265.67601: waiting for pending results... 
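The skip recorded above ("Evaluated conditional (network_state != {}): False ... skipping this task") is Ansible's standard `when:` handling: the conditional expression is templated against the host's variables, and when it renders falsey the task is skipped and the failing expression is echoed back as `false_condition`. A minimal Python stand-in of that truth test (illustrative only: Ansible actually templates the expression with Jinja2, not `eval`; the variable values below are taken from the log):

```python
def evaluate_when(expr: str, variables: dict) -> bool:
    """Illustrative stand-in for Ansible's `when:` evaluation.

    Ansible renders the expression through Jinja2 and skips the task when
    the result is falsey; plain eval() on a restricted namespace is used
    here only to mirror the truth test seen in the log output.
    """
    return bool(eval(expr, {"__builtins__": {}}, dict(variables)))

# The role default network_state is {}, so the debug task was skipped:
evaluate_when("network_state != {}", {"network_state": {}})        # → False
# A non-empty network_state would have let the task run:
evaluate_when("network_state != {}", {"network_state": {"x": 1}})  # → True
```

The earlier conditional in the same task, `ansible_distribution_major_version != '6'`, evaluated True by the same mechanism, which is why the task reached the `network_state` check at all.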
11044 1726853265.67791: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Re-test connectivity 11044 1726853265.67892: in run() - task 02083763-bbaf-c5a6-f857-000000000092 11044 1726853265.67904: variable 'ansible_search_path' from source: unknown 11044 1726853265.67908: variable 'ansible_search_path' from source: unknown 11044 1726853265.67939: calling self._execute() 11044 1726853265.68015: variable 'ansible_host' from source: host vars for 'managed_node1' 11044 1726853265.68022: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11044 1726853265.68031: variable 'omit' from source: magic vars 11044 1726853265.68320: variable 'ansible_distribution_major_version' from source: facts 11044 1726853265.68330: Evaluated conditional (ansible_distribution_major_version != '6'): True 11044 1726853265.68335: variable 'omit' from source: magic vars 11044 1726853265.68385: variable 'omit' from source: magic vars 11044 1726853265.68409: variable 'omit' from source: magic vars 11044 1726853265.68441: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11044 1726853265.68473: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11044 1726853265.68491: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11044 1726853265.68506: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11044 1726853265.68516: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11044 1726853265.68540: variable 'inventory_hostname' from source: host vars for 'managed_node1' 11044 1726853265.68544: variable 'ansible_host' from source: host vars for 'managed_node1' 11044 1726853265.68548: variable 'ansible_ssh_extra_args' from source: host vars for 
'managed_node1' 11044 1726853265.68623: Set connection var ansible_timeout to 10 11044 1726853265.68630: Set connection var ansible_shell_executable to /bin/sh 11044 1726853265.68633: Set connection var ansible_shell_type to sh 11044 1726853265.68639: Set connection var ansible_module_compression to ZIP_DEFLATED 11044 1726853265.68643: Set connection var ansible_connection to ssh 11044 1726853265.68651: Set connection var ansible_pipelining to False 11044 1726853265.68669: variable 'ansible_shell_executable' from source: unknown 11044 1726853265.68674: variable 'ansible_connection' from source: unknown 11044 1726853265.68677: variable 'ansible_module_compression' from source: unknown 11044 1726853265.68680: variable 'ansible_shell_type' from source: unknown 11044 1726853265.68683: variable 'ansible_shell_executable' from source: unknown 11044 1726853265.68685: variable 'ansible_host' from source: host vars for 'managed_node1' 11044 1726853265.68687: variable 'ansible_pipelining' from source: unknown 11044 1726853265.68689: variable 'ansible_timeout' from source: unknown 11044 1726853265.68693: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11044 1726853265.68845: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 11044 1726853265.68856: variable 'omit' from source: magic vars 11044 1726853265.68859: starting attempt loop 11044 1726853265.68861: running the handler 11044 1726853265.68875: _low_level_execute_command(): starting 11044 1726853265.68882: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11044 1726853265.69404: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11044 
1726853265.69408: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 11044 1726853265.69412: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11044 1726853265.69465: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 11044 1726853265.69468: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11044 1726853265.69474: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11044 1726853265.69522: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11044 1726853265.71216: stdout chunk (state=3): >>>/root <<< 11044 1726853265.71311: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11044 1726853265.71348: stderr chunk (state=3): >>><<< 11044 1726853265.71352: stdout chunk (state=3): >>><<< 11044 1726853265.71375: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11044 1726853265.71386: _low_level_execute_command(): starting 11044 1726853265.71392: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853265.713738-12460-128105881895733 `" && echo ansible-tmp-1726853265.713738-12460-128105881895733="` echo /root/.ansible/tmp/ansible-tmp-1726853265.713738-12460-128105881895733 `" ) && sleep 0' 11044 1726853265.71858: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11044 1726853265.71862: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11044 1726853265.71864: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: 
Reading configuration data /root/.ssh/config <<< 11044 1726853265.71879: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11044 1726853265.71882: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11044 1726853265.71925: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 11044 1726853265.71929: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11044 1726853265.71935: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11044 1726853265.71977: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11044 1726853265.73868: stdout chunk (state=3): >>>ansible-tmp-1726853265.713738-12460-128105881895733=/root/.ansible/tmp/ansible-tmp-1726853265.713738-12460-128105881895733 <<< 11044 1726853265.73978: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11044 1726853265.74007: stderr chunk (state=3): >>><<< 11044 1726853265.74010: stdout chunk (state=3): >>><<< 11044 1726853265.74027: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853265.713738-12460-128105881895733=/root/.ansible/tmp/ansible-tmp-1726853265.713738-12460-128105881895733 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 
is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11044 1726853265.74075: variable 'ansible_module_compression' from source: unknown 11044 1726853265.74110: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-1104467doc9gy/ansiballz_cache/ansible.modules.ping-ZIP_DEFLATED 11044 1726853265.74141: variable 'ansible_facts' from source: unknown 11044 1726853265.74199: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853265.713738-12460-128105881895733/AnsiballZ_ping.py 11044 1726853265.74306: Sending initial data 11044 1726853265.74310: Sent initial data (152 bytes) 11044 1726853265.74774: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11044 1726853265.74777: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 <<< 11044 1726853265.74780: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11044 1726853265.74782: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration <<< 11044 
1726853265.74785: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11044 1726853265.74788: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11044 1726853265.74838: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 11044 1726853265.74842: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11044 1726853265.74845: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11044 1726853265.74892: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11044 1726853265.76427: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 11044 1726853265.76462: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 11044 1726853265.76502: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-1104467doc9gy/tmpqudn_fu7 /root/.ansible/tmp/ansible-tmp-1726853265.713738-12460-128105881895733/AnsiballZ_ping.py <<< 11044 1726853265.76505: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853265.713738-12460-128105881895733/AnsiballZ_ping.py" <<< 11044 1726853265.76544: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-1104467doc9gy/tmpqudn_fu7" to remote "/root/.ansible/tmp/ansible-tmp-1726853265.713738-12460-128105881895733/AnsiballZ_ping.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853265.713738-12460-128105881895733/AnsiballZ_ping.py" <<< 11044 1726853265.77051: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11044 1726853265.77101: stderr chunk (state=3): >>><<< 11044 1726853265.77104: stdout chunk (state=3): >>><<< 11044 1726853265.77122: done transferring module to remote 11044 1726853265.77132: _low_level_execute_command(): starting 11044 1726853265.77136: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853265.713738-12460-128105881895733/ /root/.ansible/tmp/ansible-tmp-1726853265.713738-12460-128105881895733/AnsiballZ_ping.py && sleep 0' 11044 1726853265.77620: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11044 1726853265.77623: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11044 1726853265.77626: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11044 1726853265.77628: 
stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration <<< 11044 1726853265.77631: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11044 1726853265.77637: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11044 1726853265.77687: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 11044 1726853265.77691: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11044 1726853265.77694: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11044 1726853265.77738: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11044 1726853265.79487: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11044 1726853265.79516: stderr chunk (state=3): >>><<< 11044 1726853265.79520: stdout chunk (state=3): >>><<< 11044 1726853265.79529: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11044 1726853265.79532: _low_level_execute_command(): starting 11044 1726853265.79537: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853265.713738-12460-128105881895733/AnsiballZ_ping.py && sleep 0' 11044 1726853265.79994: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11044 1726853265.79997: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found <<< 11044 1726853265.80000: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration <<< 11044 1726853265.80002: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11044 1726853265.80005: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 
11044 1726853265.80061: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK <<< 11044 1726853265.80064: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11044 1726853265.80103: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11044 1726853265.94984: stdout chunk (state=3): >>> {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} <<< 11044 1726853265.96083: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.45.153 closed. <<< 11044 1726853265.96135: stderr chunk (state=3): >>><<< 11044 1726853265.96138: stdout chunk (state=3): >>><<< 11044 1726853265.96158: _low_level_execute_command() done: rc=0, stdout= {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: 
mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.45.153 closed. 11044 1726853265.96183: done with _execute_module (ping, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ping', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853265.713738-12460-128105881895733/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11044 1726853265.96192: _low_level_execute_command(): starting 11044 1726853265.96278: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853265.713738-12460-128105881895733/ > /dev/null 2>&1 && sleep 0' 11044 1726853265.96877: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11044 1726853265.96880: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found <<< 11044 1726853265.96883: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11044 1726853265.96886: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11044 
1726853265.96889: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11044 1726853265.96891: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 11044 1726853265.96921: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11044 1726853265.96929: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11044 1726853265.97002: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11044 1726853265.99205: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11044 1726853265.99210: stdout chunk (state=3): >>><<< 11044 1726853265.99212: stderr chunk (state=3): >>><<< 11044 1726853265.99220: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at 
'/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11044 1726853265.99226: handler run complete 11044 1726853265.99283: attempt loop complete, returning result 11044 1726853265.99287: _execute() done 11044 1726853265.99289: dumping result to json 11044 1726853265.99292: done dumping result, returning 11044 1726853265.99294: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Re-test connectivity [02083763-bbaf-c5a6-f857-000000000092] 11044 1726853265.99313: sending task result for task 02083763-bbaf-c5a6-f857-000000000092 11044 1726853265.99379: done sending task result for task 02083763-bbaf-c5a6-f857-000000000092 11044 1726853265.99382: WORKER PROCESS EXITING ok: [managed_node1] => { "changed": false, "ping": "pong" } 11044 1726853265.99486: no more pending results, returning what we have 11044 1726853265.99489: results queue empty 11044 1726853265.99490: checking for any_errors_fatal 11044 1726853265.99498: done checking for any_errors_fatal 11044 1726853265.99499: checking for max_fail_percentage 11044 1726853265.99501: done checking for max_fail_percentage 11044 1726853265.99501: checking to see if all hosts have failed and the running result is not ok 11044 1726853265.99502: done checking to see if all hosts have failed 11044 1726853265.99503: getting the remaining hosts for this loop 11044 1726853265.99504: done getting the remaining hosts for this loop 11044 1726853265.99507: getting the next task for host managed_node1 11044 1726853265.99518: done getting next task for host managed_node1 11044 1726853265.99520: ^ task is: TASK: meta (role_complete) 11044 1726853265.99523: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(None), rescue child state? (None), always child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 11044 1726853265.99534: getting variables 11044 1726853265.99536: in VariableManager get_vars() 11044 1726853265.99626: Calling all_inventory to load vars for managed_node1 11044 1726853265.99629: Calling groups_inventory to load vars for managed_node1 11044 1726853265.99631: Calling all_plugins_inventory to load vars for managed_node1 11044 1726853265.99640: Calling all_plugins_play to load vars for managed_node1 11044 1726853265.99645: Calling groups_plugins_inventory to load vars for managed_node1 11044 1726853265.99649: Calling groups_plugins_play to load vars for managed_node1 11044 1726853266.00983: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11044 1726853266.03707: done with get_vars() 11044 1726853266.03739: done getting variables 11044 1726853266.03839: done queuing things up, now waiting for results queue to drain 11044 1726853266.03841: results queue empty 11044 1726853266.03842: checking for any_errors_fatal 11044 1726853266.03845: done checking for any_errors_fatal 11044 1726853266.03846: checking for max_fail_percentage 11044 1726853266.03847: done checking for max_fail_percentage 11044 1726853266.03848: checking to see if all hosts have failed and the running result is not ok 11044 1726853266.03848: done checking to see if all 
hosts have failed 11044 1726853266.03849: getting the remaining hosts for this loop 11044 1726853266.03850: done getting the remaining hosts for this loop 11044 1726853266.03853: getting the next task for host managed_node1 11044 1726853266.03858: done getting next task for host managed_node1 11044 1726853266.03861: ^ task is: TASK: Delete the device '{{ controller_device }}' 11044 1726853266.03863: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 11044 1726853266.03865: getting variables 11044 1726853266.03866: in VariableManager get_vars() 11044 1726853266.03884: Calling all_inventory to load vars for managed_node1 11044 1726853266.03886: Calling groups_inventory to load vars for managed_node1 11044 1726853266.03888: Calling all_plugins_inventory to load vars for managed_node1 11044 1726853266.03893: Calling all_plugins_play to load vars for managed_node1 11044 1726853266.03895: Calling groups_plugins_inventory to load vars for managed_node1 11044 1726853266.03898: Calling groups_plugins_play to load vars for managed_node1 11044 1726853266.05563: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11044 1726853266.07183: done with get_vars() 11044 1726853266.07209: done getting variables 11044 1726853266.07258: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 11044 1726853266.07450: variable 'controller_device' from source: play vars TASK [Delete the device 'deprecated-bond'] ************************************* task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond_deprecated.yml:125 Friday 20 September 2024 13:27:46 -0400 (0:00:00.401) 0:00:30.450 ****** 11044 1726853266.07482: entering _queue_task() for managed_node1/command 11044 1726853266.08118: worker is 1 (out of 1 available) 11044 1726853266.08130: exiting _queue_task() for managed_node1/command 11044 1726853266.08147: done queuing things up, now waiting for results queue to drain 11044 1726853266.08148: waiting for pending results... 11044 1726853266.08887: running TaskExecutor() for managed_node1/TASK: Delete the device 'deprecated-bond' 11044 1726853266.08893: in run() - task 02083763-bbaf-c5a6-f857-0000000000c2 11044 1726853266.08896: variable 'ansible_search_path' from source: unknown 11044 1726853266.08899: calling self._execute() 11044 1726853266.08999: variable 'ansible_host' from source: host vars for 'managed_node1' 11044 1726853266.09003: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11044 1726853266.09016: variable 'omit' from source: magic vars 11044 1726853266.09476: variable 'ansible_distribution_major_version' from source: facts 11044 1726853266.09480: Evaluated conditional (ansible_distribution_major_version != '6'): True 11044 1726853266.09483: variable 'omit' from source: magic vars 11044 1726853266.09485: variable 'omit' from source: magic vars 11044 1726853266.09569: variable 'controller_device' from source: play vars 11044 1726853266.09590: variable 'omit' from source: magic vars 11044 1726853266.09659: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 
11044 1726853266.09701: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11044 1726853266.09722: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11044 1726853266.09740: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11044 1726853266.09754: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11044 1726853266.09789: variable 'inventory_hostname' from source: host vars for 'managed_node1' 11044 1726853266.09793: variable 'ansible_host' from source: host vars for 'managed_node1' 11044 1726853266.09795: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11044 1726853266.09937: Set connection var ansible_timeout to 10 11044 1726853266.09947: Set connection var ansible_shell_executable to /bin/sh 11044 1726853266.09954: Set connection var ansible_shell_type to sh 11044 1726853266.09961: Set connection var ansible_module_compression to ZIP_DEFLATED 11044 1726853266.09965: Set connection var ansible_connection to ssh 11044 1726853266.09967: Set connection var ansible_pipelining to False 11044 1726853266.09995: variable 'ansible_shell_executable' from source: unknown 11044 1726853266.09998: variable 'ansible_connection' from source: unknown 11044 1726853266.10002: variable 'ansible_module_compression' from source: unknown 11044 1726853266.10004: variable 'ansible_shell_type' from source: unknown 11044 1726853266.10006: variable 'ansible_shell_executable' from source: unknown 11044 1726853266.10008: variable 'ansible_host' from source: host vars for 'managed_node1' 11044 1726853266.10013: variable 'ansible_pipelining' from source: unknown 11044 1726853266.10016: variable 'ansible_timeout' from source: unknown 11044 1726853266.10020: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11044 1726853266.10191: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11044 1726853266.10278: variable 'omit' from source: magic vars 11044 1726853266.10286: starting attempt loop 11044 1726853266.10289: running the handler 11044 1726853266.10291: _low_level_execute_command(): starting 11044 1726853266.10293: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11044 1726853266.11198: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11044 1726853266.11220: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 11044 1726853266.11241: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11044 1726853266.11255: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 
<<< 11044 1726853266.11327: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11044 1726853266.12995: stdout chunk (state=3): >>>/root <<< 11044 1726853266.13177: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11044 1726853266.13181: stdout chunk (state=3): >>><<< 11044 1726853266.13183: stderr chunk (state=3): >>><<< 11044 1726853266.13186: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11044 1726853266.13189: _low_level_execute_command(): starting 11044 1726853266.13191: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853266.1316655-12479-263708544235332 `" && echo ansible-tmp-1726853266.1316655-12479-263708544235332="` echo 
/root/.ansible/tmp/ansible-tmp-1726853266.1316655-12479-263708544235332 `" ) && sleep 0' 11044 1726853266.13921: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11044 1726853266.13925: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11044 1726853266.13968: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found <<< 11044 1726853266.13974: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address <<< 11044 1726853266.13987: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11044 1726853266.14091: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK <<< 11044 1726853266.14095: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11044 1726853266.14139: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11044 1726853266.16062: stdout chunk (state=3): >>>ansible-tmp-1726853266.1316655-12479-263708544235332=/root/.ansible/tmp/ansible-tmp-1726853266.1316655-12479-263708544235332 <<< 11044 1726853266.16186: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11044 1726853266.16220: stderr chunk (state=3): >>><<< 11044 
1726853266.16238: stdout chunk (state=3): >>><<< 11044 1726853266.16476: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853266.1316655-12479-263708544235332=/root/.ansible/tmp/ansible-tmp-1726853266.1316655-12479-263708544235332 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11044 1726853266.16480: variable 'ansible_module_compression' from source: unknown 11044 1726853266.16482: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-1104467doc9gy/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 11044 1726853266.16484: variable 'ansible_facts' from source: unknown 11044 1726853266.16493: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853266.1316655-12479-263708544235332/AnsiballZ_command.py 11044 1726853266.16692: Sending initial data 11044 1726853266.16705: Sent initial data (156 bytes) 11044 
1726853266.17307: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11044 1726853266.17319: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11044 1726853266.17335: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11044 1726853266.17357: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11044 1726853266.17470: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 11044 1726853266.17503: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11044 1726853266.17520: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11044 1726853266.17592: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11044 1726853266.19153: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server 
supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 11044 1726853266.19193: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 11044 1726853266.19234: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-1104467doc9gy/tmp6m7ttcb4 /root/.ansible/tmp/ansible-tmp-1726853266.1316655-12479-263708544235332/AnsiballZ_command.py <<< 11044 1726853266.19237: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853266.1316655-12479-263708544235332/AnsiballZ_command.py" <<< 11044 1726853266.19304: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-1104467doc9gy/tmp6m7ttcb4" to remote "/root/.ansible/tmp/ansible-tmp-1726853266.1316655-12479-263708544235332/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853266.1316655-12479-263708544235332/AnsiballZ_command.py" <<< 11044 1726853266.20076: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11044 1726853266.20145: stderr chunk (state=3): >>><<< 11044 1726853266.20155: stdout chunk (state=3): >>><<< 11044 1726853266.20226: done transferring module to remote 11044 1726853266.20229: _low_level_execute_command(): starting 11044 1726853266.20232: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853266.1316655-12479-263708544235332/ /root/.ansible/tmp/ansible-tmp-1726853266.1316655-12479-263708544235332/AnsiballZ_command.py && sleep 0' 11044 1726853266.20889: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: 
Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11044 1726853266.20996: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 11044 1726853266.21017: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11044 1726853266.21037: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11044 1726853266.21112: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11044 1726853266.22943: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11044 1726853266.22949: stdout chunk (state=3): >>><<< 11044 1726853266.22956: stderr chunk (state=3): >>><<< 11044 1726853266.22984: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config 
debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11044 1726853266.22988: _low_level_execute_command(): starting 11044 1726853266.22993: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853266.1316655-12479-263708544235332/AnsiballZ_command.py && sleep 0' 11044 1726853266.23776: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11044 1726853266.23779: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11044 1726853266.23782: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11044 1726853266.23784: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11044 1726853266.23786: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 <<< 11044 1726853266.23788: stderr chunk (state=3): >>>debug2: match not found <<< 11044 1726853266.23789: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11044 1726853266.23791: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 11044 1726853266.23793: stderr chunk (state=3): >>>debug2: 
resolve_canonicalize: hostname 10.31.45.153 is address <<< 11044 1726853266.23795: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 11044 1726853266.23796: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11044 1726853266.23798: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11044 1726853266.23800: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11044 1726853266.23843: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 <<< 11044 1726853266.23853: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11044 1726853266.23867: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 11044 1726853266.23882: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11044 1726853266.23898: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11044 1726853266.23975: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11044 1726853266.40099: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "Cannot find device \"deprecated-bond\"", "rc": 1, "cmd": ["ip", "link", "del", "deprecated-bond"], "start": "2024-09-20 13:27:46.391127", "end": "2024-09-20 13:27:46.398067", "delta": "0:00:00.006940", "failed": true, "msg": "non-zero return code", "invocation": {"module_args": {"_raw_params": "ip link del deprecated-bond", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 11044 1726853266.41398: stderr chunk (state=3): >>>debug2: Received exit status from master 1 Shared connection to 10.31.45.153 closed. 
<<< 11044 1726853266.41452: stderr chunk (state=3): >>><<< 11044 1726853266.41488: stdout chunk (state=3): >>><<< 11044 1726853266.41815: _low_level_execute_command() done: rc=1, stdout= {"changed": true, "stdout": "", "stderr": "Cannot find device \"deprecated-bond\"", "rc": 1, "cmd": ["ip", "link", "del", "deprecated-bond"], "start": "2024-09-20 13:27:46.391127", "end": "2024-09-20 13:27:46.398067", "delta": "0:00:00.006940", "failed": true, "msg": "non-zero return code", "invocation": {"module_args": {"_raw_params": "ip link del deprecated-bond", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 1 Shared connection to 10.31.45.153 closed. 
11044 1726853266.41818: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip link del deprecated-bond', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853266.1316655-12479-263708544235332/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11044 1726853266.41822: _low_level_execute_command(): starting 11044 1726853266.41825: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853266.1316655-12479-263708544235332/ > /dev/null 2>&1 && sleep 0' 11044 1726853266.43049: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK <<< 11044 1726853266.43086: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11044 1726853266.43163: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11044 1726853266.45236: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11044 1726853266.45255: stdout chunk (state=3): >>><<< 11044 1726853266.45269: stderr chunk (state=3): >>><<< 11044 1726853266.45295: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11044 1726853266.45308: handler run complete 11044 1726853266.45577: Evaluated conditional (False): False 11044 1726853266.45580: Evaluated conditional (False): 
False
11044 1726853266.45582: attempt loop complete, returning result
11044 1726853266.45584: _execute() done
11044 1726853266.45586: dumping result to json
11044 1726853266.45588: done dumping result, returning
11044 1726853266.45590: done running TaskExecutor() for managed_node1/TASK: Delete the device 'deprecated-bond' [02083763-bbaf-c5a6-f857-0000000000c2]
11044 1726853266.45591: sending task result for task 02083763-bbaf-c5a6-f857-0000000000c2
11044 1726853266.46054: done sending task result for task 02083763-bbaf-c5a6-f857-0000000000c2
11044 1726853266.46058: WORKER PROCESS EXITING
ok: [managed_node1] => {
    "changed": false,
    "cmd": [
        "ip",
        "link",
        "del",
        "deprecated-bond"
    ],
    "delta": "0:00:00.006940",
    "end": "2024-09-20 13:27:46.398067",
    "failed_when_result": false,
    "rc": 1,
    "start": "2024-09-20 13:27:46.391127"
}

STDERR:

Cannot find device "deprecated-bond"

MSG:

non-zero return code
11044 1726853266.46185: no more pending results, returning what we have
11044 1726853266.46188: results queue empty
11044 1726853266.46190: checking for any_errors_fatal
11044 1726853266.46192: done checking for any_errors_fatal
11044 1726853266.46193: checking for max_fail_percentage
11044 1726853266.46194: done checking for max_fail_percentage
11044 1726853266.46196: checking to see if all hosts have failed and the running result is not ok
11044 1726853266.46196: done checking to see if all hosts have failed
11044 1726853266.46197: getting the remaining hosts for this loop
11044 1726853266.46199: done getting the remaining hosts for this loop
11044 1726853266.46202: getting the next task for host managed_node1
11044 1726853266.46213: done getting next task for host managed_node1
11044 1726853266.46216: ^ task is: TASK: Remove test interfaces
11044 1726853266.46221: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state?
(None), always child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 11044 1726853266.46227: getting variables 11044 1726853266.46229: in VariableManager get_vars() 11044 1726853266.46411: Calling all_inventory to load vars for managed_node1 11044 1726853266.46414: Calling groups_inventory to load vars for managed_node1 11044 1726853266.46416: Calling all_plugins_inventory to load vars for managed_node1 11044 1726853266.46427: Calling all_plugins_play to load vars for managed_node1 11044 1726853266.46430: Calling groups_plugins_inventory to load vars for managed_node1 11044 1726853266.46432: Calling groups_plugins_play to load vars for managed_node1 11044 1726853266.48525: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11044 1726853266.50218: done with get_vars() 11044 1726853266.50243: done getting variables 11044 1726853266.50307: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Remove test interfaces] ************************************************** task path: 
/tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/remove_test_interfaces_with_dhcp.yml:3 Friday 20 September 2024 13:27:46 -0400 (0:00:00.428) 0:00:30.879 ****** 11044 1726853266.50343: entering _queue_task() for managed_node1/shell 11044 1726853266.50704: worker is 1 (out of 1 available) 11044 1726853266.50717: exiting _queue_task() for managed_node1/shell 11044 1726853266.50730: done queuing things up, now waiting for results queue to drain 11044 1726853266.50731: waiting for pending results... 11044 1726853266.51027: running TaskExecutor() for managed_node1/TASK: Remove test interfaces 11044 1726853266.51166: in run() - task 02083763-bbaf-c5a6-f857-0000000000c6 11044 1726853266.51193: variable 'ansible_search_path' from source: unknown 11044 1726853266.51203: variable 'ansible_search_path' from source: unknown 11044 1726853266.51242: calling self._execute() 11044 1726853266.51347: variable 'ansible_host' from source: host vars for 'managed_node1' 11044 1726853266.51360: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11044 1726853266.51377: variable 'omit' from source: magic vars 11044 1726853266.51976: variable 'ansible_distribution_major_version' from source: facts 11044 1726853266.51979: Evaluated conditional (ansible_distribution_major_version != '6'): True 11044 1726853266.51982: variable 'omit' from source: magic vars 11044 1726853266.51984: variable 'omit' from source: magic vars 11044 1726853266.52007: variable 'dhcp_interface1' from source: play vars 11044 1726853266.52017: variable 'dhcp_interface2' from source: play vars 11044 1726853266.52038: variable 'omit' from source: magic vars 11044 1726853266.52086: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11044 1726853266.52128: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11044 1726853266.52152: trying 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11044 1726853266.52173: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11044 1726853266.52191: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11044 1726853266.52231: variable 'inventory_hostname' from source: host vars for 'managed_node1' 11044 1726853266.52239: variable 'ansible_host' from source: host vars for 'managed_node1' 11044 1726853266.52246: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11044 1726853266.52348: Set connection var ansible_timeout to 10 11044 1726853266.52364: Set connection var ansible_shell_executable to /bin/sh 11044 1726853266.52373: Set connection var ansible_shell_type to sh 11044 1726853266.52385: Set connection var ansible_module_compression to ZIP_DEFLATED 11044 1726853266.52395: Set connection var ansible_connection to ssh 11044 1726853266.52405: Set connection var ansible_pipelining to False 11044 1726853266.52439: variable 'ansible_shell_executable' from source: unknown 11044 1726853266.52446: variable 'ansible_connection' from source: unknown 11044 1726853266.52452: variable 'ansible_module_compression' from source: unknown 11044 1726853266.52458: variable 'ansible_shell_type' from source: unknown 11044 1726853266.52464: variable 'ansible_shell_executable' from source: unknown 11044 1726853266.52470: variable 'ansible_host' from source: host vars for 'managed_node1' 11044 1726853266.52479: variable 'ansible_pipelining' from source: unknown 11044 1726853266.52485: variable 'ansible_timeout' from source: unknown 11044 1726853266.52492: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11044 1726853266.52636: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11044 1726853266.52656: variable 'omit' from source: magic vars 11044 1726853266.52665: starting attempt loop 11044 1726853266.52673: running the handler 11044 1726853266.52687: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11044 1726853266.52709: _low_level_execute_command(): starting 11044 1726853266.52752: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11044 1726853266.53473: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11044 1726853266.53489: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11044 1726853266.53510: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11044 1726853266.53535: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11044 1726853266.53631: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 11044 1726853266.53650: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 11044 1726853266.53680: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11044 1726853266.53697: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11044 1726853266.53842: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11044 1726853266.55424: stdout chunk (state=3): >>>/root <<< 11044 1726853266.55643: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11044 1726853266.55652: stdout chunk (state=3): >>><<< 11044 1726853266.55661: stderr chunk (state=3): >>><<< 11044 1726853266.55690: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status 
from master 0 11044 1726853266.55706: _low_level_execute_command(): starting 11044 1726853266.55709: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853266.5569012-12502-175545580267231 `" && echo ansible-tmp-1726853266.5569012-12502-175545580267231="` echo /root/.ansible/tmp/ansible-tmp-1726853266.5569012-12502-175545580267231 `" ) && sleep 0' 11044 1726853266.56793: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11044 1726853266.56809: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11044 1726853266.56827: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11044 1726853266.56848: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11044 1726853266.56888: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11044 1726853266.56900: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 <<< 11044 1726853266.56981: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 11044 1726853266.57004: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11044 1726853266.57021: stderr chunk (state=3): >>>debug2: 
mux_client_hello_exchange: master version 4 <<< 11044 1726853266.57107: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11044 1726853266.59054: stdout chunk (state=3): >>>ansible-tmp-1726853266.5569012-12502-175545580267231=/root/.ansible/tmp/ansible-tmp-1726853266.5569012-12502-175545580267231 <<< 11044 1726853266.59212: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11044 1726853266.59216: stderr chunk (state=3): >>><<< 11044 1726853266.59219: stdout chunk (state=3): >>><<< 11044 1726853266.59287: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853266.5569012-12502-175545580267231=/root/.ansible/tmp/ansible-tmp-1726853266.5569012-12502-175545580267231 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11044 1726853266.59321: variable 'ansible_module_compression' from source: 
unknown 11044 1726853266.59378: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-1104467doc9gy/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 11044 1726853266.59677: variable 'ansible_facts' from source: unknown 11044 1726853266.59737: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853266.5569012-12502-175545580267231/AnsiballZ_command.py 11044 1726853266.60046: Sending initial data 11044 1726853266.60053: Sent initial data (156 bytes) 11044 1726853266.61450: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11044 1726853266.61533: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK <<< 11044 1726853266.61622: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11044 1726853266.61867: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11044 1726853266.63421: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server 
supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 11044 1726853266.63632: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853266.5569012-12502-175545580267231/AnsiballZ_command.py" debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-1104467doc9gy/tmpwu6nmk14" to remote "/root/.ansible/tmp/ansible-tmp-1726853266.5569012-12502-175545580267231/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853266.5569012-12502-175545580267231/AnsiballZ_command.py" <<< 11044 1726853266.63637: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-1104467doc9gy/tmpwu6nmk14 /root/.ansible/tmp/ansible-tmp-1726853266.5569012-12502-175545580267231/AnsiballZ_command.py <<< 11044 1726853266.64860: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11044 1726853266.64974: stderr chunk (state=3): >>><<< 11044 1726853266.64978: stdout chunk (state=3): >>><<< 11044 1726853266.65000: done transferring module to remote 11044 1726853266.65010: _low_level_execute_command(): starting 11044 1726853266.65016: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853266.5569012-12502-175545580267231/ 
/root/.ansible/tmp/ansible-tmp-1726853266.5569012-12502-175545580267231/AnsiballZ_command.py && sleep 0' 11044 1726853266.65614: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11044 1726853266.65622: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11044 1726853266.65633: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11044 1726853266.65650: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11044 1726853266.65661: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 <<< 11044 1726853266.65670: stderr chunk (state=3): >>>debug2: match not found <<< 11044 1726853266.65714: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11044 1726853266.65717: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 11044 1726853266.65774: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK <<< 11044 1726853266.65797: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11044 1726853266.65863: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11044 1726853266.67949: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11044 1726853266.67954: stdout chunk (state=3): >>><<< 11044 
1726853266.67956: stderr chunk (state=3): >>><<< 11044 1726853266.67959: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11044 1726853266.67965: _low_level_execute_command(): starting 11044 1726853266.67970: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853266.5569012-12502-175545580267231/AnsiballZ_command.py && sleep 0' 11044 1726853266.68978: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11044 1726853266.69092: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found 
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 11044 1726853266.69124: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11044 1726853266.69138: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11044 1726853266.69214: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11044 1726853266.88601: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "+ exec\n+ rc=0\n+ ip link delete test1\n+ '[' 0 '!=' 0 ']'\n+ ip link delete test2\n+ '[' 0 '!=' 0 ']'\n+ ip link delete testbr\n+ '[' 0 '!=' 0 ']'", "rc": 0, "cmd": "set -euxo pipefail\nexec 1>&2\nrc=0\nip link delete test1 || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link test1 - error \"$rc\"\nfi\nip link delete test2 || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link test2 - error \"$rc\"\nfi\nip link delete testbr || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link testbr - error \"$rc\"\nfi\n", "start": "2024-09-20 13:27:46.843040", "end": "2024-09-20 13:27:46.885239", "delta": "0:00:00.042199", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euxo pipefail\nexec 1>&2\nrc=0\nip link delete test1 || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link test1 - error \"$rc\"\nfi\nip link delete test2 || 
rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link test2 - error \"$rc\"\nfi\nip link delete testbr || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link testbr - error \"$rc\"\nfi\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}}<<< 11044 1726853266.88784: stdout chunk (state=3): >>> <<< 11044 1726853266.90237: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.45.153 closed. <<< 11044 1726853266.90241: stdout chunk (state=3): >>><<< 11044 1726853266.90248: stderr chunk (state=3): >>><<< 11044 1726853266.90275: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "+ exec\n+ rc=0\n+ ip link delete test1\n+ '[' 0 '!=' 0 ']'\n+ ip link delete test2\n+ '[' 0 '!=' 0 ']'\n+ ip link delete testbr\n+ '[' 0 '!=' 0 ']'", "rc": 0, "cmd": "set -euxo pipefail\nexec 1>&2\nrc=0\nip link delete test1 || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link test1 - error \"$rc\"\nfi\nip link delete test2 || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link test2 - error \"$rc\"\nfi\nip link delete testbr || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link testbr - error \"$rc\"\nfi\n", "start": "2024-09-20 13:27:46.843040", "end": "2024-09-20 13:27:46.885239", "delta": "0:00:00.042199", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euxo pipefail\nexec 1>&2\nrc=0\nip link delete test1 || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link test1 - error \"$rc\"\nfi\nip link delete test2 || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link test2 - error \"$rc\"\nfi\nip link delete testbr || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link testbr - 
error \"$rc\"\nfi\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.45.153 closed. 
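For readability, the `_raw_params` payload that appears JSON-escaped in the module arguments above unescapes to the script below. This is a hedged reconstruction of the "Remove test interfaces" task (task path reported as remove_test_interfaces_with_dhcp.yml:3): the script body and task name are taken verbatim from the log; every other task keyword is an assumption.

```yaml
# Hypothetical reconstruction of the "Remove test interfaces" task whose
# module args are visible (escaped) in the log. Script body is verbatim
# from _raw_params; the rest of the task is assumed.
- name: Remove test interfaces
  shell: |
    set -euxo pipefail
    exec 1>&2
    rc=0
    ip link delete test1 || rc="$?"
    if [ "$rc" != 0 ]; then
      echo ERROR - could not delete link test1 - error "$rc"
    fi
    ip link delete test2 || rc="$?"
    if [ "$rc" != 0 ]; then
      echo ERROR - could not delete link test2 - error "$rc"
    fi
    ip link delete testbr || rc="$?"
    if [ "$rc" != 0 ]; then
      echo ERROR - could not delete link testbr - error "$rc"
    fi
```

Note the `exec 1>&2`: all subsequent output (including the `set -x` trace) goes to stderr, which is why the result JSON shows an empty `"stdout"` and the `+ ip link delete …` trace lines in `"stderr"`, and why each `|| rc="$?"` keeps `set -e` from aborting on a missing link.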
11044 1726853266.90316: done with _execute_module (ansible.legacy.command, {'_raw_params': 'set -euxo pipefail\nexec 1>&2\nrc=0\nip link delete test1 || rc="$?"\nif [ "$rc" != 0 ]; then\n echo ERROR - could not delete link test1 - error "$rc"\nfi\nip link delete test2 || rc="$?"\nif [ "$rc" != 0 ]; then\n echo ERROR - could not delete link test2 - error "$rc"\nfi\nip link delete testbr || rc="$?"\nif [ "$rc" != 0 ]; then\n echo ERROR - could not delete link testbr - error "$rc"\nfi\n', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853266.5569012-12502-175545580267231/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11044 1726853266.90324: _low_level_execute_command(): starting 11044 1726853266.90329: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853266.5569012-12502-175545580267231/ > /dev/null 2>&1 && sleep 0' 11044 1726853266.90959: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11044 1726853266.90968: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11044 1726853266.90982: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11044 1726853266.90995: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11044 1726853266.91007: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.153 
originally 10.31.45.153 <<< 11044 1726853266.91014: stderr chunk (state=3): >>>debug2: match not found <<< 11044 1726853266.91176: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11044 1726853266.91184: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 11044 1726853266.91186: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.45.153 is address <<< 11044 1726853266.91189: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 11044 1726853266.91191: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK <<< 11044 1726853266.91193: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11044 1726853266.91230: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11044 1726853266.93177: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11044 1726853266.93180: stderr chunk (state=3): >>><<< 11044 1726853266.93182: stdout chunk (state=3): >>><<< 11044 1726853266.93185: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11044 1726853266.93187: handler run complete 11044 1726853266.93189: Evaluated conditional (False): False 11044 1726853266.93199: attempt loop complete, returning result 11044 1726853266.93202: _execute() done 11044 1726853266.93204: dumping result to json 11044 1726853266.93209: done dumping result, returning 11044 1726853266.93218: done running TaskExecutor() for managed_node1/TASK: Remove test interfaces [02083763-bbaf-c5a6-f857-0000000000c6] 11044 1726853266.93220: sending task result for task 02083763-bbaf-c5a6-f857-0000000000c6 11044 1726853266.93417: done sending task result for task 02083763-bbaf-c5a6-f857-0000000000c6 ok: [managed_node1] => { "changed": false, "cmd": "set -euxo pipefail\nexec 1>&2\nrc=0\nip link delete test1 || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link test1 - error \"$rc\"\nfi\nip link delete test2 || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link test2 - error \"$rc\"\nfi\nip link delete testbr || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link testbr - error \"$rc\"\nfi\n", "delta": "0:00:00.042199", "end": "2024-09-20 13:27:46.885239", "rc": 0, "start": "2024-09-20 13:27:46.843040" } STDERR: + exec + rc=0 + ip 
link delete test1 + '[' 0 '!=' 0 ']' + ip link delete test2 + '[' 0 '!=' 0 ']' + ip link delete testbr + '[' 0 '!=' 0 ']' 11044 1726853266.93497: no more pending results, returning what we have 11044 1726853266.93577: results queue empty 11044 1726853266.93579: checking for any_errors_fatal 11044 1726853266.93620: done checking for any_errors_fatal 11044 1726853266.93621: checking for max_fail_percentage 11044 1726853266.93624: done checking for max_fail_percentage 11044 1726853266.93625: checking to see if all hosts have failed and the running result is not ok 11044 1726853266.93626: done checking to see if all hosts have failed 11044 1726853266.93626: getting the remaining hosts for this loop 11044 1726853266.93628: done getting the remaining hosts for this loop 11044 1726853266.93631: getting the next task for host managed_node1 11044 1726853266.93640: done getting next task for host managed_node1 11044 1726853266.93643: ^ task is: TASK: Stop dnsmasq/radvd services 11044 1726853266.93647: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 11044 1726853266.93652: getting variables 11044 1726853266.93654: in VariableManager get_vars() 11044 1726853266.93799: Calling all_inventory to load vars for managed_node1 11044 1726853266.93802: Calling groups_inventory to load vars for managed_node1 11044 1726853266.93930: WORKER PROCESS EXITING 11044 1726853266.93937: Calling all_plugins_inventory to load vars for managed_node1 11044 1726853266.93951: Calling all_plugins_play to load vars for managed_node1 11044 1726853266.93954: Calling groups_plugins_inventory to load vars for managed_node1 11044 1726853266.93957: Calling groups_plugins_play to load vars for managed_node1 11044 1726853266.95036: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11044 1726853266.95904: done with get_vars() 11044 1726853266.95918: done getting variables 11044 1726853266.95962: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Stop dnsmasq/radvd services] ********************************************* task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/remove_test_interfaces_with_dhcp.yml:23 Friday 20 September 2024 13:27:46 -0400 (0:00:00.456) 0:00:31.335 ****** 11044 1726853266.95988: entering _queue_task() for managed_node1/shell 11044 1726853266.96222: worker is 1 (out of 1 available) 11044 1726853266.96235: exiting _queue_task() for managed_node1/shell 11044 1726853266.96251: done queuing things up, now waiting for results queue to drain 11044 1726853266.96252: waiting for pending results... 
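The "Remove test interfaces" task above relies on a shell error-capture idiom: under `set -e`, suffixing each `ip link delete` with `|| rc="$?"` records a failure without aborting the script, so all three links (`test1`, `test2`, `testbr`) are always attempted. A minimal, self-contained sketch of the idiom, using `false` as a stand-in for a failing `ip link delete` (the real command needs root and an existing interface):

```shell
#!/usr/bin/env bash
# Demonstrates the rc-capture pattern used by the task's script.
set -euo pipefail

rc=0
# `false` stands in for a failing `ip link delete <name>`.
# The `|| rc="$?"` suffix keeps `set -e` from aborting the script
# and stores the command's exit status for later reporting.
false || rc="$?"
if [ "$rc" != 0 ]; then
    echo "ERROR - could not delete link - error $rc"
fi
echo "cleanup continued to the end (rc=$rc)"
```

This is why the log shows `rc=0` for the task even though any individual `ip link delete` could have failed: failures are reported on stderr but never propagate to the task's exit status.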
11044 1726853266.96434: running TaskExecutor() for managed_node1/TASK: Stop dnsmasq/radvd services 11044 1726853266.96596: in run() - task 02083763-bbaf-c5a6-f857-0000000000c7 11044 1726853266.96629: variable 'ansible_search_path' from source: unknown 11044 1726853266.96716: variable 'ansible_search_path' from source: unknown 11044 1726853266.96720: calling self._execute() 11044 1726853266.96801: variable 'ansible_host' from source: host vars for 'managed_node1' 11044 1726853266.96815: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11044 1726853266.96827: variable 'omit' from source: magic vars 11044 1726853266.97236: variable 'ansible_distribution_major_version' from source: facts 11044 1726853266.97299: Evaluated conditional (ansible_distribution_major_version != '6'): True 11044 1726853266.97324: variable 'omit' from source: magic vars 11044 1726853266.97400: variable 'omit' from source: magic vars 11044 1726853266.97453: variable 'omit' from source: magic vars 11044 1726853266.97517: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11044 1726853266.97551: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11044 1726853266.97568: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11044 1726853266.97583: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11044 1726853266.97597: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11044 1726853266.97618: variable 'inventory_hostname' from source: host vars for 'managed_node1' 11044 1726853266.97621: variable 'ansible_host' from source: host vars for 'managed_node1' 11044 1726853266.97624: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11044 
1726853266.97703: Set connection var ansible_timeout to 10 11044 1726853266.97708: Set connection var ansible_shell_executable to /bin/sh 11044 1726853266.97711: Set connection var ansible_shell_type to sh 11044 1726853266.97718: Set connection var ansible_module_compression to ZIP_DEFLATED 11044 1726853266.97720: Set connection var ansible_connection to ssh 11044 1726853266.97725: Set connection var ansible_pipelining to False 11044 1726853266.97742: variable 'ansible_shell_executable' from source: unknown 11044 1726853266.97747: variable 'ansible_connection' from source: unknown 11044 1726853266.97750: variable 'ansible_module_compression' from source: unknown 11044 1726853266.97753: variable 'ansible_shell_type' from source: unknown 11044 1726853266.97755: variable 'ansible_shell_executable' from source: unknown 11044 1726853266.97757: variable 'ansible_host' from source: host vars for 'managed_node1' 11044 1726853266.97762: variable 'ansible_pipelining' from source: unknown 11044 1726853266.97765: variable 'ansible_timeout' from source: unknown 11044 1726853266.97768: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11044 1726853266.97873: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11044 1726853266.97882: variable 'omit' from source: magic vars 11044 1726853266.97885: starting attempt loop 11044 1726853266.97888: running the handler 11044 1726853266.97898: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11044 1726853266.97913: 
_low_level_execute_command(): starting 11044 1726853266.97920: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11044 1726853266.98418: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11044 1726853266.98422: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11044 1726853266.98425: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found <<< 11044 1726853266.98429: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11044 1726853266.98476: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 11044 1726853266.98479: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11044 1726853266.98483: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11044 1726853266.98528: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11044 1726853267.00130: stdout chunk (state=3): >>>/root <<< 11044 1726853267.00285: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11044 1726853267.00288: stdout chunk (state=3): >>><<< 11044 1726853267.00290: stderr chunk (state=3): >>><<< 11044 1726853267.00401: 
_low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11044 1726853267.00405: _low_level_execute_command(): starting 11044 1726853267.00408: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853267.0031374-12524-144962101630274 `" && echo ansible-tmp-1726853267.0031374-12524-144962101630274="` echo /root/.ansible/tmp/ansible-tmp-1726853267.0031374-12524-144962101630274 `" ) && sleep 0' 11044 1726853267.00887: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11044 1726853267.00897: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11044 1726853267.00900: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration <<< 11044 1726853267.00902: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11044 1726853267.00904: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11044 1726853267.00939: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 11044 1726853267.00945: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11044 1726853267.00991: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11044 1726853267.02865: stdout chunk (state=3): >>>ansible-tmp-1726853267.0031374-12524-144962101630274=/root/.ansible/tmp/ansible-tmp-1726853267.0031374-12524-144962101630274 <<< 11044 1726853267.02988: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11044 1726853267.02999: stderr chunk (state=3): >>><<< 11044 1726853267.03003: stdout chunk (state=3): >>><<< 11044 1726853267.03023: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853267.0031374-12524-144962101630274=/root/.ansible/tmp/ansible-tmp-1726853267.0031374-12524-144962101630274 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration 
data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11044 1726853267.03055: variable 'ansible_module_compression' from source: unknown 11044 1726853267.03147: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-1104467doc9gy/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 11044 1726853267.03150: variable 'ansible_facts' from source: unknown 11044 1726853267.03333: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853267.0031374-12524-144962101630274/AnsiballZ_command.py 11044 1726853267.03580: Sending initial data 11044 1726853267.03584: Sent initial data (156 bytes) 11044 1726853267.03940: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11044 1726853267.03949: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11044 1726853267.03960: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11044 1726853267.03976: stderr chunk (state=3): >>>debug1: Reading configuration 
data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11044 1726853267.03987: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 <<< 11044 1726853267.04053: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11044 1726853267.04094: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 11044 1726853267.04105: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11044 1726853267.04114: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11044 1726853267.04208: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11044 1726853267.05755: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server 
extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 11044 1726853267.05791: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 11044 1726853267.05831: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-1104467doc9gy/tmpruu_afdd /root/.ansible/tmp/ansible-tmp-1726853267.0031374-12524-144962101630274/AnsiballZ_command.py <<< 11044 1726853267.05834: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853267.0031374-12524-144962101630274/AnsiballZ_command.py" <<< 11044 1726853267.05872: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-1104467doc9gy/tmpruu_afdd" to remote "/root/.ansible/tmp/ansible-tmp-1726853267.0031374-12524-144962101630274/AnsiballZ_command.py" <<< 11044 1726853267.05878: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853267.0031374-12524-144962101630274/AnsiballZ_command.py" <<< 11044 1726853267.06577: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11044 1726853267.06581: stdout chunk (state=3): >>><<< 11044 1726853267.06583: stderr chunk (state=3): >>><<< 11044 1726853267.06605: done transferring module to remote 11044 1726853267.06624: _low_level_execute_command(): starting 11044 1726853267.06701: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853267.0031374-12524-144962101630274/ /root/.ansible/tmp/ansible-tmp-1726853267.0031374-12524-144962101630274/AnsiballZ_command.py && sleep 0' 11044 1726853267.07166: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11044 1726853267.07184: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11044 1726853267.07206: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11044 1726853267.07236: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 11044 1726853267.07253: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11044 1726853267.07295: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11044 1726853267.09055: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11044 1726853267.09080: stderr chunk (state=3): >>><<< 11044 1726853267.09083: stdout chunk (state=3): >>><<< 11044 1726853267.09097: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing 
configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11044 1726853267.09100: _low_level_execute_command(): starting 11044 1726853267.09106: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853267.0031374-12524-144962101630274/AnsiballZ_command.py && sleep 0' 11044 1726853267.09530: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11044 1726853267.09534: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 <<< 11044 1726853267.09579: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11044 
1726853267.09624: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 11044 1726853267.09631: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11044 1726853267.09634: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11044 1726853267.09676: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11044 1726853267.27448: stdout chunk (state=3): >>> <<< 11044 1726853267.27470: stdout chunk (state=3): >>>{"changed": true, "stdout": "", "stderr": "+ exec\n+ pkill -F /run/dhcp_testbr.pid\n+ rm -rf /run/dhcp_testbr.pid\n+ rm -rf /run/dhcp_testbr.lease\n+ grep 'release 6' /etc/redhat-release\n+ systemctl is-active firewalld\ninactive", "rc": 0, "cmd": "set -uxo pipefail\nexec 1>&2\npkill -F /run/dhcp_testbr.pid\nrm -rf /run/dhcp_testbr.pid\nrm -rf /run/dhcp_testbr.lease\nif grep 'release 6' /etc/redhat-release; then\n # Stop radvd server\n service radvd stop\n iptables -D INPUT -i testbr -p udp --dport 67:68 --sport 67:68 -j ACCEPT\nfi\nif systemctl is-active firewalld; then\n for service in dhcp dhcpv6 dhcpv6-client; do\n if firewall-cmd --query-service=\"$service\"; then\n firewall-cmd --remove-service \"$service\"\n fi\n done\nfi\n", "start": "2024-09-20 13:27:47.245105", "end": "2024-09-20 13:27:47.272283", "delta": "0:00:00.027178", "msg": "", "invocation": {"module_args": {"_raw_params": "set -uxo pipefail\nexec 1>&2\npkill -F /run/dhcp_testbr.pid\nrm -rf /run/dhcp_testbr.pid\nrm -rf /run/dhcp_testbr.lease\nif grep 'release 6' /etc/redhat-release; then\n # Stop radvd server\n service radvd stop\n iptables -D INPUT -i testbr -p udp --dport 67:68 --sport 67:68 -j ACCEPT\nfi\nif systemctl is-active firewalld; then\n for service in dhcp dhcpv6 dhcpv6-client; do\n if firewall-cmd --query-service=\"$service\"; then\n firewall-cmd --remove-service \"$service\"\n fi\n done\nfi\n", "_uses_shell": true, "expand_argument_vars": true, 
"stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 11044 1726853267.29036: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.45.153 closed. <<< 11044 1726853267.29040: stdout chunk (state=3): >>><<< 11044 1726853267.29045: stderr chunk (state=3): >>><<< 11044 1726853267.29177: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "+ exec\n+ pkill -F /run/dhcp_testbr.pid\n+ rm -rf /run/dhcp_testbr.pid\n+ rm -rf /run/dhcp_testbr.lease\n+ grep 'release 6' /etc/redhat-release\n+ systemctl is-active firewalld\ninactive", "rc": 0, "cmd": "set -uxo pipefail\nexec 1>&2\npkill -F /run/dhcp_testbr.pid\nrm -rf /run/dhcp_testbr.pid\nrm -rf /run/dhcp_testbr.lease\nif grep 'release 6' /etc/redhat-release; then\n # Stop radvd server\n service radvd stop\n iptables -D INPUT -i testbr -p udp --dport 67:68 --sport 67:68 -j ACCEPT\nfi\nif systemctl is-active firewalld; then\n for service in dhcp dhcpv6 dhcpv6-client; do\n if firewall-cmd --query-service=\"$service\"; then\n firewall-cmd --remove-service \"$service\"\n fi\n done\nfi\n", "start": "2024-09-20 13:27:47.245105", "end": "2024-09-20 13:27:47.272283", "delta": "0:00:00.027178", "msg": "", "invocation": {"module_args": {"_raw_params": "set -uxo pipefail\nexec 1>&2\npkill -F /run/dhcp_testbr.pid\nrm -rf /run/dhcp_testbr.pid\nrm -rf /run/dhcp_testbr.lease\nif grep 'release 6' /etc/redhat-release; then\n # Stop radvd server\n service radvd stop\n iptables -D INPUT -i testbr -p udp --dport 67:68 --sport 67:68 -j ACCEPT\nfi\nif systemctl is-active firewalld; then\n for service in dhcp dhcpv6 dhcpv6-client; do\n if firewall-cmd --query-service=\"$service\"; then\n firewall-cmd --remove-service \"$service\"\n fi\n done\nfi\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, 
"chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.45.153 closed. 
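For readability, the "Stop dnsmasq/radvd services" script embedded in the module arguments above, unescaped from the JSON payload. Unlike the previous task it sets `-uxo pipefail` without `-e`, so a failing `pkill` (e.g. when no dnsmasq is running) or a non-matching `grep` does not abort the cleanup. It requires root and is reproduced here for reference, not for direct reuse:

```shell
#!/bin/sh
# Cleanup script from the task result above; omitting `-e` is deliberate,
# so individual failures do not stop the remaining cleanup steps.
set -uxo pipefail
exec 1>&2
pkill -F /run/dhcp_testbr.pid
rm -rf /run/dhcp_testbr.pid
rm -rf /run/dhcp_testbr.lease
if grep 'release 6' /etc/redhat-release; then
    # Stop radvd server
    service radvd stop
    iptables -D INPUT -i testbr -p udp --dport 67:68 --sport 67:68 -j ACCEPT
fi
if systemctl is-active firewalld; then
    for service in dhcp dhcpv6 dhcpv6-client; do
        if firewall-cmd --query-service="$service"; then
            firewall-cmd --remove-service "$service"
        fi
    done
fi
```

On this host the log shows `grep 'release 6'` not matching and `systemctl is-active firewalld` reporting `inactive`, so both conditional branches were skipped and the script still exited 0.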
11044 1726853267.29185: done with _execute_module (ansible.legacy.command, {'_raw_params': 'set -uxo pipefail\nexec 1>&2\npkill -F /run/dhcp_testbr.pid\nrm -rf /run/dhcp_testbr.pid\nrm -rf /run/dhcp_testbr.lease\nif grep \'release 6\' /etc/redhat-release; then\n # Stop radvd server\n service radvd stop\n iptables -D INPUT -i testbr -p udp --dport 67:68 --sport 67:68 -j ACCEPT\nfi\nif systemctl is-active firewalld; then\n for service in dhcp dhcpv6 dhcpv6-client; do\n if firewall-cmd --query-service="$service"; then\n firewall-cmd --remove-service "$service"\n fi\n done\nfi\n', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853267.0031374-12524-144962101630274/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11044 1726853267.29188: _low_level_execute_command(): starting 11044 1726853267.29233: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853267.0031374-12524-144962101630274/ > /dev/null 2>&1 && sleep 0' 11044 1726853267.30389: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11044 1726853267.30586: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11044 1726853267.30629: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 11044 1726853267.30758: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11044 1726853267.30794: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11044 1726853267.30834: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11044 1726853267.32766: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11044 1726853267.32781: stdout chunk (state=3): >>><<< 11044 1726853267.32797: stderr chunk (state=3): >>><<< 11044 1726853267.32820: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11044 1726853267.33021: handler run complete 11044 1726853267.33024: Evaluated conditional (False): False 11044 1726853267.33027: attempt loop complete, returning result 11044 1726853267.33029: _execute() done 11044 1726853267.33031: dumping result to json 11044 1726853267.33033: done dumping result, returning 11044 1726853267.33035: done running TaskExecutor() for managed_node1/TASK: Stop dnsmasq/radvd services [02083763-bbaf-c5a6-f857-0000000000c7] 11044 1726853267.33037: sending task result for task 02083763-bbaf-c5a6-f857-0000000000c7 ok: [managed_node1] => { "changed": false, "cmd": "set -uxo pipefail\nexec 1>&2\npkill -F /run/dhcp_testbr.pid\nrm -rf /run/dhcp_testbr.pid\nrm -rf /run/dhcp_testbr.lease\nif grep 'release 6' /etc/redhat-release; then\n # Stop radvd server\n service radvd stop\n iptables -D INPUT -i testbr -p udp --dport 67:68 --sport 67:68 -j ACCEPT\nfi\nif systemctl is-active firewalld; then\n for service in dhcp dhcpv6 dhcpv6-client; do\n if firewall-cmd --query-service=\"$service\"; then\n firewall-cmd --remove-service \"$service\"\n fi\n done\nfi\n", "delta": "0:00:00.027178", "end": "2024-09-20 13:27:47.272283", "rc": 0, "start": "2024-09-20 13:27:47.245105" } STDERR: + exec + pkill -F /run/dhcp_testbr.pid + rm -rf /run/dhcp_testbr.pid + rm -rf /run/dhcp_testbr.lease + grep 'release 6' /etc/redhat-release + systemctl is-active firewalld inactive 11044 1726853267.33197: no more pending results, returning what we have 11044 1726853267.33200: results queue empty 11044 
1726853267.33201: checking for any_errors_fatal 11044 1726853267.33212: done checking for any_errors_fatal 11044 1726853267.33213: checking for max_fail_percentage 11044 1726853267.33214: done checking for max_fail_percentage 11044 1726853267.33215: checking to see if all hosts have failed and the running result is not ok 11044 1726853267.33216: done checking to see if all hosts have failed 11044 1726853267.33217: getting the remaining hosts for this loop 11044 1726853267.33218: done getting the remaining hosts for this loop 11044 1726853267.33221: getting the next task for host managed_node1 11044 1726853267.33230: done getting next task for host managed_node1 11044 1726853267.33233: ^ task is: TASK: Restore the /etc/resolv.conf for initscript 11044 1726853267.33236: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 11044 1726853267.33240: getting variables 11044 1726853267.33245: in VariableManager get_vars() 11044 1726853267.33291: Calling all_inventory to load vars for managed_node1 11044 1726853267.33294: Calling groups_inventory to load vars for managed_node1 11044 1726853267.33297: Calling all_plugins_inventory to load vars for managed_node1 11044 1726853267.33308: Calling all_plugins_play to load vars for managed_node1 11044 1726853267.33311: Calling groups_plugins_inventory to load vars for managed_node1 11044 1726853267.33314: Calling groups_plugins_play to load vars for managed_node1 11044 1726853267.34207: done sending task result for task 02083763-bbaf-c5a6-f857-0000000000c7 11044 1726853267.34215: WORKER PROCESS EXITING 11044 1726853267.36058: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11044 1726853267.38194: done with get_vars() 11044 1726853267.38218: done getting variables 11044 1726853267.38286: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Restore the /etc/resolv.conf for initscript] ***************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond_deprecated.yml:131 Friday 20 September 2024 13:27:47 -0400 (0:00:00.423) 0:00:31.758 ****** 11044 1726853267.38317: entering _queue_task() for managed_node1/command 11044 1726853267.39078: worker is 1 (out of 1 available) 11044 1726853267.39093: exiting _queue_task() for managed_node1/command 11044 1726853267.39108: done queuing things up, now waiting for results queue to drain 11044 1726853267.39110: waiting for pending results... 
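The "Stop dnsmasq/radvd services" result above shows the firewalld branch of the logged cleanup script being skipped: its STDERR records `systemctl is-active firewalld` printing `inactive`, so the `firewall-cmd --remove-service` loop never ran. A minimal dry-run sketch of that branch, with `systemctl` stubbed by a variable so it runs anywhere:

```shell
#!/bin/sh
# Dry-run sketch of the firewalld branch of the logged cleanup script.
# firewalld_state is a stub standing in for `systemctl is-active firewalld`;
# "inactive" is the value the log actually recorded.
set -u
firewalld_state=inactive
msg=""
if [ "$firewalld_state" = active ]; then
    # These are the three services the real script queries and removes.
    for service in dhcp dhcpv6 dhcpv6-client; do
        msg="$msg would remove $service;"
    done
else
    msg="firewalld inactive, nothing to remove"
fi
echo "$msg"
```

With the logged value, the loop body is never reached and the task reports `changed: false`.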
11044 1726853267.39549: running TaskExecutor() for managed_node1/TASK: Restore the /etc/resolv.conf for initscript 11044 1726853267.39733: in run() - task 02083763-bbaf-c5a6-f857-0000000000c8 11044 1726853267.39756: variable 'ansible_search_path' from source: unknown 11044 1726853267.39822: calling self._execute() 11044 1726853267.39928: variable 'ansible_host' from source: host vars for 'managed_node1' 11044 1726853267.39939: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11044 1726853267.39954: variable 'omit' from source: magic vars 11044 1726853267.40337: variable 'ansible_distribution_major_version' from source: facts 11044 1726853267.40355: Evaluated conditional (ansible_distribution_major_version != '6'): True 11044 1726853267.40475: variable 'network_provider' from source: set_fact 11044 1726853267.40486: Evaluated conditional (network_provider == "initscripts"): False 11044 1726853267.40493: when evaluation is False, skipping this task 11044 1726853267.40499: _execute() done 11044 1726853267.40510: dumping result to json 11044 1726853267.40517: done dumping result, returning 11044 1726853267.40577: done running TaskExecutor() for managed_node1/TASK: Restore the /etc/resolv.conf for initscript [02083763-bbaf-c5a6-f857-0000000000c8] 11044 1726853267.40579: sending task result for task 02083763-bbaf-c5a6-f857-0000000000c8 11044 1726853267.40774: done sending task result for task 02083763-bbaf-c5a6-f857-0000000000c8 11044 1726853267.40778: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "network_provider == \"initscripts\"", "skip_reason": "Conditional result was False" } 11044 1726853267.40812: no more pending results, returning what we have 11044 1726853267.40814: results queue empty 11044 1726853267.40815: checking for any_errors_fatal 11044 1726853267.40823: done checking for any_errors_fatal 11044 1726853267.40823: checking for max_fail_percentage 11044 1726853267.40825: done 
checking for max_fail_percentage 11044 1726853267.40826: checking to see if all hosts have failed and the running result is not ok 11044 1726853267.40826: done checking to see if all hosts have failed 11044 1726853267.40827: getting the remaining hosts for this loop 11044 1726853267.40828: done getting the remaining hosts for this loop 11044 1726853267.40831: getting the next task for host managed_node1 11044 1726853267.40837: done getting next task for host managed_node1 11044 1726853267.40840: ^ task is: TASK: Verify network state restored to default 11044 1726853267.40845: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 11044 1726853267.40849: getting variables 11044 1726853267.40851: in VariableManager get_vars() 11044 1726853267.40886: Calling all_inventory to load vars for managed_node1 11044 1726853267.40889: Calling groups_inventory to load vars for managed_node1 11044 1726853267.40891: Calling all_plugins_inventory to load vars for managed_node1 11044 1726853267.40900: Calling all_plugins_play to load vars for managed_node1 11044 1726853267.40903: Calling groups_plugins_inventory to load vars for managed_node1 11044 1726853267.40906: Calling groups_plugins_play to load vars for managed_node1 11044 1726853267.42504: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11044 1726853267.44572: done with get_vars() 11044 1726853267.44602: done getting variables TASK [Verify network state restored to default] ******************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond_deprecated.yml:136 Friday 20 September 2024 13:27:47 -0400 (0:00:00.063) 0:00:31.822 ****** 11044 1726853267.44705: entering _queue_task() for managed_node1/include_tasks 11044 1726853267.45077: worker is 1 (out of 1 available) 11044 1726853267.45091: exiting _queue_task() for managed_node1/include_tasks 11044 1726853267.45105: done queuing things up, now waiting for results queue to drain 11044 1726853267.45107: waiting for pending results... 
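The resolv.conf task above was skipped because its two `when:` conditions resolved to True and False respectively: `ansible_distribution_major_version != '6'` held, but `network_provider == "initscripts"` did not (the provider fact was something else, e.g. `nm`). A shell sketch of that gating logic, with hypothetical stand-in values chosen to reproduce the logged outcome:

```shell
#!/bin/sh
# Sketch of the two conditionals evaluated in the log. The concrete values
# are hypothetical: the log only tells us the version is not "6" and the
# provider is not "initscripts".
ansible_distribution_major_version=9
network_provider=nm
if [ "$ansible_distribution_major_version" != 6 ] \
        && [ "$network_provider" = initscripts ]; then
    result=run
else
    result=skip
fi
echo "$result"
```

Both conditions must hold for the task to run; the second one failing is what produces the `"false_condition": "network_provider == \"initscripts\""` entry in the skip result.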
11044 1726853267.45439: running TaskExecutor() for managed_node1/TASK: Verify network state restored to default 11044 1726853267.45880: in run() - task 02083763-bbaf-c5a6-f857-0000000000c9 11044 1726853267.45884: variable 'ansible_search_path' from source: unknown 11044 1726853267.45913: calling self._execute() 11044 1726853267.46119: variable 'ansible_host' from source: host vars for 'managed_node1' 11044 1726853267.46193: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11044 1726853267.46215: variable 'omit' from source: magic vars 11044 1726853267.46887: variable 'ansible_distribution_major_version' from source: facts 11044 1726853267.46891: Evaluated conditional (ansible_distribution_major_version != '6'): True 11044 1726853267.46893: _execute() done 11044 1726853267.46896: dumping result to json 11044 1726853267.46898: done dumping result, returning 11044 1726853267.46901: done running TaskExecutor() for managed_node1/TASK: Verify network state restored to default [02083763-bbaf-c5a6-f857-0000000000c9] 11044 1726853267.46903: sending task result for task 02083763-bbaf-c5a6-f857-0000000000c9 11044 1726853267.47178: done sending task result for task 02083763-bbaf-c5a6-f857-0000000000c9 11044 1726853267.47182: WORKER PROCESS EXITING 11044 1726853267.47211: no more pending results, returning what we have 11044 1726853267.47216: in VariableManager get_vars() 11044 1726853267.47270: Calling all_inventory to load vars for managed_node1 11044 1726853267.47276: Calling groups_inventory to load vars for managed_node1 11044 1726853267.47279: Calling all_plugins_inventory to load vars for managed_node1 11044 1726853267.47294: Calling all_plugins_play to load vars for managed_node1 11044 1726853267.47297: Calling groups_plugins_inventory to load vars for managed_node1 11044 1726853267.47300: Calling groups_plugins_play to load vars for managed_node1 11044 1726853267.49439: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11044 1726853267.52533: done with get_vars() 11044 1726853267.52557: variable 'ansible_search_path' from source: unknown 11044 1726853267.52698: we have included files to process 11044 1726853267.52699: generating all_blocks data 11044 1726853267.52701: done generating all_blocks data 11044 1726853267.52707: processing included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml 11044 1726853267.52708: loading included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml 11044 1726853267.52711: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml 11044 1726853267.53931: done processing included file 11044 1726853267.53933: iterating over new_blocks loaded from include file 11044 1726853267.53935: in VariableManager get_vars() 11044 1726853267.53956: done with get_vars() 11044 1726853267.53958: filtering new block on tags 11044 1726853267.53997: done filtering new block on tags 11044 1726853267.54000: done iterating over new_blocks loaded from include file included: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml for managed_node1 11044 1726853267.54006: extending task lists for all hosts with included blocks 11044 1726853267.55584: done extending task lists 11044 1726853267.55586: done processing included files 11044 1726853267.55586: results queue empty 11044 1726853267.55587: checking for any_errors_fatal 11044 1726853267.55590: done checking for any_errors_fatal 11044 1726853267.55591: checking for max_fail_percentage 11044 1726853267.55592: done checking for max_fail_percentage 11044 1726853267.55593: checking to see if all hosts have failed and the running 
result is not ok 11044 1726853267.55594: done checking to see if all hosts have failed 11044 1726853267.55595: getting the remaining hosts for this loop 11044 1726853267.55596: done getting the remaining hosts for this loop 11044 1726853267.55598: getting the next task for host managed_node1 11044 1726853267.55603: done getting next task for host managed_node1 11044 1726853267.55605: ^ task is: TASK: Check routes and DNS 11044 1726853267.55608: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 11044 1726853267.55611: getting variables 11044 1726853267.55612: in VariableManager get_vars() 11044 1726853267.55628: Calling all_inventory to load vars for managed_node1 11044 1726853267.55630: Calling groups_inventory to load vars for managed_node1 11044 1726853267.55633: Calling all_plugins_inventory to load vars for managed_node1 11044 1726853267.55639: Calling all_plugins_play to load vars for managed_node1 11044 1726853267.55641: Calling groups_plugins_inventory to load vars for managed_node1 11044 1726853267.55647: Calling groups_plugins_play to load vars for managed_node1 11044 1726853267.57038: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11044 1726853267.58818: done with get_vars() 11044 1726853267.58838: done getting variables 11044 1726853267.58886: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Check routes and DNS] **************************************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml:6 Friday 20 September 2024 13:27:47 -0400 (0:00:00.142) 0:00:31.964 ****** 11044 1726853267.58915: entering _queue_task() for managed_node1/shell 11044 1726853267.59261: worker is 1 (out of 1 available) 11044 1726853267.59377: exiting _queue_task() for managed_node1/shell 11044 1726853267.59389: done queuing things up, now waiting for results queue to drain 11044 1726853267.59390: waiting for pending results... 
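Before the "Check routes and DNS" module payload can be transferred, the log shows Ansible creating a remote working directory with a command of the shape `( umask 77 && mkdir -p "...tmp" && mkdir "...ansible-tmp-<ts>-<pid>-<rand>" ) && sleep 0`. A minimal local sketch of that pattern (the real path lives under `/root/.ansible/tmp` and embeds a timestamp and PID; here `mktemp -d` stands in for it):

```shell
#!/bin/sh
# Sketch of Ansible's remote tmpdir creation: umask 77 makes the new
# directory mode 0700 so only the connecting user can read the staged module.
base=$(mktemp -d)            # stand-in for /root/.ansible/tmp
name="ansible-tmp-example"   # real names embed timestamp/PID/random suffix
( umask 77 && mkdir -p "$base" && mkdir "$base/$name" )
perms=$(ls -ld "$base/$name" | cut -c1-10)
echo "$perms"
```

The subshell keeps the `umask` change local, and echoing the directory name back (as the logged command does) lets the controller parse the path it must use for the subsequent `AnsiballZ_command.py` upload.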
11044 1726853267.59699: running TaskExecutor() for managed_node1/TASK: Check routes and DNS 11044 1726853267.59709: in run() - task 02083763-bbaf-c5a6-f857-000000000570 11044 1726853267.59714: variable 'ansible_search_path' from source: unknown 11044 1726853267.59717: variable 'ansible_search_path' from source: unknown 11044 1726853267.59753: calling self._execute() 11044 1726853267.59862: variable 'ansible_host' from source: host vars for 'managed_node1' 11044 1726853267.59868: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11044 1726853267.59879: variable 'omit' from source: magic vars 11044 1726853267.60405: variable 'ansible_distribution_major_version' from source: facts 11044 1726853267.60409: Evaluated conditional (ansible_distribution_major_version != '6'): True 11044 1726853267.60411: variable 'omit' from source: magic vars 11044 1726853267.60414: variable 'omit' from source: magic vars 11044 1726853267.60416: variable 'omit' from source: magic vars 11044 1726853267.60426: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11044 1726853267.60676: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11044 1726853267.60680: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11044 1726853267.60683: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11044 1726853267.60685: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11044 1726853267.60687: variable 'inventory_hostname' from source: host vars for 'managed_node1' 11044 1726853267.60690: variable 'ansible_host' from source: host vars for 'managed_node1' 11044 1726853267.60692: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11044 1726853267.60694: 
Set connection var ansible_timeout to 10 11044 1726853267.60696: Set connection var ansible_shell_executable to /bin/sh 11044 1726853267.60698: Set connection var ansible_shell_type to sh 11044 1726853267.60700: Set connection var ansible_module_compression to ZIP_DEFLATED 11044 1726853267.60702: Set connection var ansible_connection to ssh 11044 1726853267.60704: Set connection var ansible_pipelining to False 11044 1726853267.60729: variable 'ansible_shell_executable' from source: unknown 11044 1726853267.60732: variable 'ansible_connection' from source: unknown 11044 1726853267.60735: variable 'ansible_module_compression' from source: unknown 11044 1726853267.60737: variable 'ansible_shell_type' from source: unknown 11044 1726853267.60740: variable 'ansible_shell_executable' from source: unknown 11044 1726853267.60742: variable 'ansible_host' from source: host vars for 'managed_node1' 11044 1726853267.60749: variable 'ansible_pipelining' from source: unknown 11044 1726853267.60752: variable 'ansible_timeout' from source: unknown 11044 1726853267.60754: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11044 1726853267.60905: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11044 1726853267.60915: variable 'omit' from source: magic vars 11044 1726853267.60920: starting attempt loop 11044 1726853267.60929: running the handler 11044 1726853267.60932: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11044 1726853267.60958: 
_low_level_execute_command(): starting 11044 1726853267.60966: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11044 1726853267.61881: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11044 1726853267.61886: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 11044 1726853267.61889: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11044 1726853267.61891: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11044 1726853267.62002: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11044 1726853267.63705: stdout chunk (state=3): >>>/root <<< 11044 1726853267.63842: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11044 1726853267.63860: stdout chunk (state=3): >>><<< 11044 1726853267.63880: stderr chunk (state=3): >>><<< 11044 1726853267.63908: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: 
Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11044 1726853267.63931: _low_level_execute_command(): starting 11044 1726853267.63943: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853267.6391585-12559-203867589571905 `" && echo ansible-tmp-1726853267.6391585-12559-203867589571905="` echo /root/.ansible/tmp/ansible-tmp-1726853267.6391585-12559-203867589571905 `" ) && sleep 0' 11044 1726853267.64577: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11044 1726853267.64617: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11044 1726853267.64639: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11044 1726853267.64659: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11044 1726853267.64702: stderr chunk (state=3): 
>>>debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 <<< 11044 1726853267.64747: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11044 1726853267.64841: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK <<< 11044 1726853267.64883: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11044 1726853267.64970: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11044 1726853267.66897: stdout chunk (state=3): >>>ansible-tmp-1726853267.6391585-12559-203867589571905=/root/.ansible/tmp/ansible-tmp-1726853267.6391585-12559-203867589571905 <<< 11044 1726853267.66998: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11044 1726853267.67027: stderr chunk (state=3): >>><<< 11044 1726853267.67030: stdout chunk (state=3): >>><<< 11044 1726853267.67049: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853267.6391585-12559-203867589571905=/root/.ansible/tmp/ansible-tmp-1726853267.6391585-12559-203867589571905 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11044 1726853267.67078: variable 'ansible_module_compression' from source: unknown 11044 1726853267.67124: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-1104467doc9gy/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 11044 1726853267.67158: variable 'ansible_facts' from source: unknown 11044 1726853267.67213: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853267.6391585-12559-203867589571905/AnsiballZ_command.py 11044 1726853267.67317: Sending initial data 11044 1726853267.67321: Sent initial data (156 bytes) 11044 1726853267.67737: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11044 1726853267.67745: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11044 1726853267.67774: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 11044 1726853267.67778: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11044 1726853267.67780: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11044 1726853267.67829: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 11044 1726853267.67833: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11044 1726853267.67880: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11044 1726853267.69416: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" <<< 11044 1726853267.69422: stderr chunk (state=3): >>>debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 11044 1726853267.69454: stderr chunk (state=3): >>>debug2: Sending 
SSH2_FXP_REALPATH "." <<< 11044 1726853267.69493: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-1104467doc9gy/tmpl6ol_p8w /root/.ansible/tmp/ansible-tmp-1726853267.6391585-12559-203867589571905/AnsiballZ_command.py <<< 11044 1726853267.69500: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853267.6391585-12559-203867589571905/AnsiballZ_command.py" <<< 11044 1726853267.69532: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-1104467doc9gy/tmpl6ol_p8w" to remote "/root/.ansible/tmp/ansible-tmp-1726853267.6391585-12559-203867589571905/AnsiballZ_command.py" <<< 11044 1726853267.69536: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853267.6391585-12559-203867589571905/AnsiballZ_command.py" <<< 11044 1726853267.70056: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11044 1726853267.70101: stderr chunk (state=3): >>><<< 11044 1726853267.70104: stdout chunk (state=3): >>><<< 11044 1726853267.70141: done transferring module to remote 11044 1726853267.70151: _low_level_execute_command(): starting 11044 1726853267.70156: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853267.6391585-12559-203867589571905/ /root/.ansible/tmp/ansible-tmp-1726853267.6391585-12559-203867589571905/AnsiballZ_command.py && sleep 0' 11044 1726853267.70574: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11044 1726853267.70607: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11044 1726853267.70610: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found <<< 11044 
1726853267.70612: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11044 1726853267.70618: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 11044 1726853267.70621: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found <<< 11044 1726853267.70623: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11044 1726853267.70659: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 11044 1726853267.70680: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11044 1726853267.70715: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11044 1726853267.72677: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11044 1726853267.72682: stderr chunk (state=3): >>><<< 11044 1726853267.72684: stdout chunk (state=3): >>><<< 11044 1726853267.72687: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: 
re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11044 1726853267.72689: _low_level_execute_command(): starting 11044 1726853267.72692: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853267.6391585-12559-203867589571905/AnsiballZ_command.py && sleep 0' 11044 1726853267.73201: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11044 1726853267.73210: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11044 1726853267.73220: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11044 1726853267.73243: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11044 1726853267.73258: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 <<< 11044 1726853267.73265: stderr chunk (state=3): >>>debug2: match not found <<< 11044 1726853267.73278: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11044 1726853267.73292: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 11044 1726853267.73352: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11044 1726853267.73394: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 11044 1726853267.73405: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11044 1726853267.73424: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11044 1726853267.73495: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11044 1726853267.89727: stdout chunk (state=3): >>> {"changed": true, "stdout": "IP\n1: lo: mtu 65536 qdisc noqueue state UNKNOWN group default qlen 1000\n link/loopback 00:00:00:00:00:00 brd 00:00:00:00:00:00\n inet 127.0.0.1/8 scope host lo\n valid_lft forever preferred_lft forever\n inet6 ::1/128 scope host noprefixroute \n valid_lft forever preferred_lft forever\n2: eth0: mtu 9001 qdisc mq state UP group default qlen 1000\n link/ether 02:3a:e7:40:bc:9f brd ff:ff:ff:ff:ff:ff\n altname enX0\n inet 10.31.45.153/22 brd 10.31.47.255 scope global dynamic noprefixroute eth0\n valid_lft 3181sec preferred_lft 3181sec\n inet6 fe80::3a:e7ff:fe40:bc9f/64 scope link noprefixroute \n valid_lft forever preferred_lft forever\nIP ROUTE\ndefault via 10.31.44.1 dev eth0 proto dhcp src 10.31.45.153 metric 100 \n10.31.44.0/22 dev eth0 proto kernel scope link src 10.31.45.153 metric 100 \nIP -6 ROUTE\nfe80::/64 dev eth0 proto kernel metric 1024 pref medium\nRESOLV\n# Generated by NetworkManager\nsearch us-east-1.aws.redhat.com\nnameserver 10.29.169.13\nnameserver 10.29.170.12\nnameserver 10.2.32.1", "stderr": "", "rc": 0, "cmd": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho 
RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "start": "2024-09-20 13:27:47.887582", "end": "2024-09-20 13:27:47.896342", "delta": "0:00:00.008760", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 11044 1726853267.91286: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.45.153 closed. <<< 11044 1726853267.91313: stdout chunk (state=3): >>><<< 11044 1726853267.91317: stderr chunk (state=3): >>><<< 11044 1726853267.91336: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "IP\n1: lo: mtu 65536 qdisc noqueue state UNKNOWN group default qlen 1000\n link/loopback 00:00:00:00:00:00 brd 00:00:00:00:00:00\n inet 127.0.0.1/8 scope host lo\n valid_lft forever preferred_lft forever\n inet6 ::1/128 scope host noprefixroute \n valid_lft forever preferred_lft forever\n2: eth0: mtu 9001 qdisc mq state UP group default qlen 1000\n link/ether 02:3a:e7:40:bc:9f brd ff:ff:ff:ff:ff:ff\n altname enX0\n inet 10.31.45.153/22 brd 10.31.47.255 scope global dynamic noprefixroute eth0\n valid_lft 3181sec preferred_lft 3181sec\n inet6 fe80::3a:e7ff:fe40:bc9f/64 scope link noprefixroute \n valid_lft forever preferred_lft forever\nIP ROUTE\ndefault via 10.31.44.1 dev eth0 proto dhcp src 10.31.45.153 metric 100 \n10.31.44.0/22 dev eth0 proto kernel scope link src 10.31.45.153 metric 100 \nIP -6 ROUTE\nfe80::/64 dev eth0 proto kernel metric 1024 pref medium\nRESOLV\n# Generated by 
NetworkManager\nsearch us-east-1.aws.redhat.com\nnameserver 10.29.169.13\nnameserver 10.29.170.12\nnameserver 10.2.32.1", "stderr": "", "rc": 0, "cmd": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "start": "2024-09-20 13:27:47.887582", "end": "2024-09-20 13:27:47.896342", "delta": "0:00:00.008760", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting 
O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.45.153 closed. 11044 1726853267.91483: done with _execute_module (ansible.legacy.command, {'_raw_params': 'set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853267.6391585-12559-203867589571905/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11044 1726853267.91488: _low_level_execute_command(): starting 11044 1726853267.91490: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853267.6391585-12559-203867589571905/ > /dev/null 2>&1 && sleep 0' 11044 1726853267.92089: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11044 1726853267.92105: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11044 1726853267.92119: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11044 1726853267.92139: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11044 1726853267.92247: stderr chunk (state=3): >>>debug2: checking 
match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK <<< 11044 1726853267.92266: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11044 1726853267.92377: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11044 1726853267.94181: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11044 1726853267.94193: stdout chunk (state=3): >>><<< 11044 1726853267.94214: stderr chunk (state=3): >>><<< 11044 1726853267.94237: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0
11044 1726853267.94251: handler run complete
11044 1726853267.94280: Evaluated conditional (False): False
11044 1726853267.94308: attempt loop complete, returning result
11044 1726853267.94310: _execute() done
11044 1726853267.94312: dumping result to json
11044 1726853267.94375: done dumping result, returning
11044 1726853267.94378: done running TaskExecutor() for managed_node1/TASK: Check routes and DNS [02083763-bbaf-c5a6-f857-000000000570]
11044 1726853267.94380: sending task result for task 02083763-bbaf-c5a6-f857-000000000570
11044 1726853267.94457: done sending task result for task 02083763-bbaf-c5a6-f857-000000000570
ok: [managed_node1] => {
    "changed": false,
    "cmd": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n",
    "delta": "0:00:00.008760",
    "end": "2024-09-20 13:27:47.896342",
    "rc": 0,
    "start": "2024-09-20 13:27:47.887582"
}

STDOUT:

IP
1: lo: mtu 65536 qdisc noqueue state UNKNOWN group default qlen 1000
 link/loopback 00:00:00:00:00:00 brd 00:00:00:00:00:00
 inet 127.0.0.1/8 scope host lo
 valid_lft forever preferred_lft forever
 inet6 ::1/128 scope host noprefixroute
 valid_lft forever preferred_lft forever
2: eth0: mtu 9001 qdisc mq state UP group default qlen 1000
 link/ether 02:3a:e7:40:bc:9f brd ff:ff:ff:ff:ff:ff
 altname enX0
 inet 10.31.45.153/22 brd 10.31.47.255 scope global dynamic noprefixroute eth0
 valid_lft 3181sec preferred_lft 3181sec
 inet6 fe80::3a:e7ff:fe40:bc9f/64 scope link noprefixroute
 valid_lft forever preferred_lft forever
IP ROUTE
default via 10.31.44.1 dev eth0 proto dhcp src 10.31.45.153 metric 100
10.31.44.0/22 dev eth0 proto kernel scope link src 10.31.45.153 metric 100
IP -6 ROUTE
fe80::/64 dev eth0 proto kernel metric 1024 pref medium
RESOLV
# Generated by NetworkManager
search us-east-1.aws.redhat.com
nameserver 10.29.169.13
nameserver 10.29.170.12
nameserver 10.2.32.1
11044 1726853267.94737: no more pending results, returning what we have
11044 1726853267.94741: results queue empty
11044 1726853267.94745: checking for any_errors_fatal
11044 1726853267.94746: done checking for any_errors_fatal
11044 1726853267.94747: checking for max_fail_percentage
11044 1726853267.94749: done checking for max_fail_percentage
11044 1726853267.94749: checking to see if all hosts have failed and the running result is not ok
11044 1726853267.94750: done checking to see if all hosts have failed
11044 1726853267.94751: getting the remaining hosts for this loop
11044 1726853267.94753: done getting the remaining hosts for this loop
11044 1726853267.94756: getting the next task for host managed_node1
11044 1726853267.94762: done getting next task for host managed_node1
11044 1726853267.94765: ^ task is: TASK: Verify DNS and network connectivity
11044 1726853267.94769: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False
11044 1726853267.94781: getting variables
11044 1726853267.94783: in VariableManager get_vars()
11044 1726853267.94827: Calling all_inventory to load vars for managed_node1
11044 1726853267.94830: Calling groups_inventory to load vars for managed_node1
11044 1726853267.94833: Calling all_plugins_inventory to load vars for managed_node1
11044 1726853267.94848: Calling all_plugins_play to load vars for managed_node1
11044 1726853267.94852: Calling groups_plugins_inventory to load vars for managed_node1
11044 1726853267.94856: Calling groups_plugins_play to load vars for managed_node1
11044 1726853267.95388: WORKER PROCESS EXITING
11044 1726853267.96405: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
11044 1726853267.98089: done with get_vars()
11044 1726853267.98116: done getting variables
11044 1726853267.98190: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [Verify DNS and network connectivity] *************************************
task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml:24
Friday 20 September 2024 13:27:47 -0400 (0:00:00.393) 0:00:32.357 ******
11044 1726853267.98223: entering _queue_task() for managed_node1/shell
11044 1726853267.98696: worker is 1 (out of 1 available)
11044 1726853267.98708: exiting _queue_task() for managed_node1/shell
11044 1726853267.98719: done queuing things up, now waiting for results
queue to drain
11044 1726853267.98720: waiting for pending results...
11044 1726853267.98915: running TaskExecutor() for managed_node1/TASK: Verify DNS and network connectivity
11044 1726853267.99053: in run() - task 02083763-bbaf-c5a6-f857-000000000571
11044 1726853267.99079: variable 'ansible_search_path' from source: unknown
11044 1726853267.99086: variable 'ansible_search_path' from source: unknown
11044 1726853267.99119: calling self._execute()
11044 1726853267.99222: variable 'ansible_host' from source: host vars for 'managed_node1'
11044 1726853267.99232: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
11044 1726853267.99248: variable 'omit' from source: magic vars
11044 1726853267.99714: variable 'ansible_distribution_major_version' from source: facts
11044 1726853267.99733: Evaluated conditional (ansible_distribution_major_version != '6'): True
11044 1726853267.99947: variable 'ansible_facts' from source: unknown
11044 1726853268.00890: Evaluated conditional (ansible_facts["distribution"] == "CentOS"): True
11044 1726853268.00901: variable 'omit' from source: magic vars
11044 1726853268.00954: variable 'omit' from source: magic vars
11044 1726853268.00996: variable 'omit' from source: magic vars
11044 1726853268.01049: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection
11044 1726853268.01094: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False)
11044 1726853268.01118: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell
11044 1726853268.01147: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
11044 1726853268.01163: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
11044 1726853268.01202: variable 'inventory_hostname' from source: host vars for 'managed_node1'
11044 1726853268.01235: variable 'ansible_host' from source: host vars for 'managed_node1'
11044 1726853268.01238: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
11044 1726853268.01329: Set connection var ansible_timeout to 10
11044 1726853268.01349: Set connection var ansible_shell_executable to /bin/sh
11044 1726853268.01356: Set connection var ansible_shell_type to sh
11044 1726853268.01421: Set connection var ansible_module_compression to ZIP_DEFLATED
11044 1726853268.01424: Set connection var ansible_connection to ssh
11044 1726853268.01426: Set connection var ansible_pipelining to False
11044 1726853268.01428: variable 'ansible_shell_executable' from source: unknown
11044 1726853268.01430: variable 'ansible_connection' from source: unknown
11044 1726853268.01432: variable 'ansible_module_compression' from source: unknown
11044 1726853268.01434: variable 'ansible_shell_type' from source: unknown
11044 1726853268.01436: variable 'ansible_shell_executable' from source: unknown
11044 1726853268.01438: variable 'ansible_host' from source: host vars for 'managed_node1'
11044 1726853268.01440: variable 'ansible_pipelining' from source: unknown
11044 1726853268.01442: variable 'ansible_timeout' from source: unknown
11044 1726853268.01457: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
11044 1726853268.01605: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False)
11044 1726853268.01621: variable 'omit' from source: magic vars
11044 1726853268.01629: starting attempt loop
11044 1726853268.01669: running the handler
11044 1726853268.01674: Loading ActionModule 'command' from
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11044 1726853268.01685: _low_level_execute_command(): starting 11044 1726853268.01696: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11044 1726853268.02346: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11044 1726853268.02350: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11044 1726853268.02377: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11044 1726853268.02381: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11044 1726853268.02480: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK <<< 11044 1726853268.02586: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 <<< 11044 1726853268.04662: stdout chunk (state=3): >>>/root <<< 11044 1726853268.04665: stderr chunk (state=3): 
>>>debug2: Received exit status from master 0 <<< 11044 1726853268.04668: stdout chunk (state=3): >>><<< 11044 1726853268.04670: stderr chunk (state=3): >>><<< 11044 1726853268.04676: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11044 1726853268.04679: _low_level_execute_command(): starting 11044 1726853268.04682: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853268.0456023-12579-270169242773105 `" && echo ansible-tmp-1726853268.0456023-12579-270169242773105="` echo /root/.ansible/tmp/ansible-tmp-1726853268.0456023-12579-270169242773105 `" ) && sleep 0' 11044 1726853268.05735: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11044 1726853268.05759: stderr chunk 
(state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11044 1726853268.05774: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11044 1726853268.05793: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11044 1726853268.05811: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 <<< 11044 1726853268.05868: stderr chunk (state=3): >>>debug2: match not found <<< 11044 1726853268.05964: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK <<< 11044 1726853268.05988: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11044 1726853268.06064: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11044 1726853268.08245: stdout chunk (state=3): >>>ansible-tmp-1726853268.0456023-12579-270169242773105=/root/.ansible/tmp/ansible-tmp-1726853268.0456023-12579-270169242773105 <<< 11044 1726853268.08249: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11044 1726853268.08251: stdout chunk (state=3): >>><<< 11044 1726853268.08254: stderr chunk (state=3): >>><<< 11044 1726853268.08274: _low_level_execute_command() done: rc=0, 
stdout=ansible-tmp-1726853268.0456023-12579-270169242773105=/root/.ansible/tmp/ansible-tmp-1726853268.0456023-12579-270169242773105 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11044 1726853268.08678: variable 'ansible_module_compression' from source: unknown 11044 1726853268.08683: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-1104467doc9gy/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 11044 1726853268.08686: variable 'ansible_facts' from source: unknown 11044 1726853268.08988: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853268.0456023-12579-270169242773105/AnsiballZ_command.py 11044 1726853268.09354: Sending initial data 11044 1726853268.09385: Sent initial data (156 bytes) 11044 1726853268.10548: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11044 1726853268.10717: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK <<< 11044 1726853268.10773: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11044 1726853268.10888: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11044 1726853268.12496: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 11044 1726853268.12509: stderr chunk (state=3): >>>debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension 
"home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 11044 1726853268.12603: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 11044 1726853268.12668: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-1104467doc9gy/tmp6if1h7ru /root/.ansible/tmp/ansible-tmp-1726853268.0456023-12579-270169242773105/AnsiballZ_command.py <<< 11044 1726853268.12673: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853268.0456023-12579-270169242773105/AnsiballZ_command.py" <<< 11044 1726853268.12708: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-1104467doc9gy/tmp6if1h7ru" to remote "/root/.ansible/tmp/ansible-tmp-1726853268.0456023-12579-270169242773105/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853268.0456023-12579-270169242773105/AnsiballZ_command.py" <<< 11044 1726853268.13907: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11044 1726853268.14221: stderr chunk (state=3): >>><<< 11044 1726853268.14225: stdout chunk (state=3): >>><<< 11044 1726853268.14248: done transferring module to remote 11044 1726853268.14258: _low_level_execute_command(): starting 11044 1726853268.14264: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853268.0456023-12579-270169242773105/ /root/.ansible/tmp/ansible-tmp-1726853268.0456023-12579-270169242773105/AnsiballZ_command.py && sleep 0' 11044 1726853268.15477: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11044 1726853268.15482: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11044 1726853268.15624: stderr chunk (state=3): >>>debug2: 
checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found <<< 11044 1726853268.15633: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11044 1726853268.15647: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK <<< 11044 1726853268.15799: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11044 1726853268.15865: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11044 1726853268.17992: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11044 1726853268.17996: stdout chunk (state=3): >>><<< 11044 1726853268.18001: stderr chunk (state=3): >>><<< 11044 1726853268.18121: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11044 1726853268.18124: _low_level_execute_command(): starting 11044 1726853268.18128: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853268.0456023-12579-270169242773105/AnsiballZ_command.py && sleep 0' 11044 1726853268.19293: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11044 1726853268.19337: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 11044 1726853268.19352: stderr chunk (state=3): 
>>>debug2: fd 3 setting O_NONBLOCK <<< 11044 1726853268.19380: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11044 1726853268.19440: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11044 1726853268.61237: stdout chunk (state=3): >>> {"changed": true, "stdout": "CHECK DNS AND CONNECTIVITY\n2600:2701:4000:5211:dead:beef:fe:fed3 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed7 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed6 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed7 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed6 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2600:2701:4000:5211:dead:beef:fe:fed3 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org", "stderr": " % Total % Received % Xferd Average Speed Time Time Time Current\n Dload Upload Total Spent Left Speed\n\r 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0\r100 305 100 305 0 0 2315 0 --:--:-- --:--:-- --:--:-- 2328\n % Total % Received % Xferd Average Speed Time Time Time Current\n Dload Upload Total Spent Left Speed\n\r 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0\r100 291 100 291 0 0 2676 0 --:--:-- --:--:-- --:--:-- 
2694", "rc": 0, "cmd": "set -euo pipefail\necho CHECK DNS AND CONNECTIVITY\nfor host in mirrors.fedoraproject.org mirrors.centos.org; do\n if ! getent hosts \"$host\"; then\n echo FAILED to lookup host \"$host\"\n exit 1\n fi\n if ! curl -o /dev/null https://\"$host\"; then\n echo FAILED to contact host \"$host\"\n exit 1\n fi\ndone\n", "start": "2024-09-20 13:27:48.345906", "end": "2024-09-20 13:27:48.611500", "delta": "0:00:00.265594", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euo pipefail\necho CHECK DNS AND CONNECTIVITY\nfor host in mirrors.fedoraproject.org mirrors.centos.org; do\n if ! getent hosts \"$host\"; then\n echo FAILED to lookup host \"$host\"\n exit 1\n fi\n if ! curl -o /dev/null https://\"$host\"; then\n echo FAILED to contact host \"$host\"\n exit 1\n fi\ndone\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 11044 1726853268.62855: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.45.153 closed. 
<<< 11044 1726853268.62881: stderr chunk (state=3): >>><<< 11044 1726853268.62885: stdout chunk (state=3): >>><<< 11044 1726853268.62903: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "CHECK DNS AND CONNECTIVITY\n2600:2701:4000:5211:dead:beef:fe:fed3 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed7 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed6 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed7 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed6 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2600:2701:4000:5211:dead:beef:fe:fed3 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org", "stderr": " % Total % Received % Xferd Average Speed Time Time Time Current\n Dload Upload Total Spent Left Speed\n\r 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0\r100 305 100 305 0 0 2315 0 --:--:-- --:--:-- --:--:-- 2328\n % Total % Received % Xferd Average Speed Time Time Time Current\n Dload Upload Total Spent Left Speed\n\r 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0\r100 291 100 291 0 0 2676 0 --:--:-- --:--:-- --:--:-- 2694", "rc": 0, "cmd": "set -euo pipefail\necho CHECK DNS AND CONNECTIVITY\nfor host in mirrors.fedoraproject.org 
mirrors.centos.org; do\n if ! getent hosts \"$host\"; then\n echo FAILED to lookup host \"$host\"\n exit 1\n fi\n if ! curl -o /dev/null https://\"$host\"; then\n echo FAILED to contact host \"$host\"\n exit 1\n fi\ndone\n", "start": "2024-09-20 13:27:48.345906", "end": "2024-09-20 13:27:48.611500", "delta": "0:00:00.265594", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euo pipefail\necho CHECK DNS AND CONNECTIVITY\nfor host in mirrors.fedoraproject.org mirrors.centos.org; do\n if ! getent hosts \"$host\"; then\n echo FAILED to lookup host \"$host\"\n exit 1\n fi\n if ! curl -o /dev/null https://\"$host\"; then\n echo FAILED to contact host \"$host\"\n exit 1\n fi\ndone\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: 
mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.45.153 closed. 11044 1726853268.62940: done with _execute_module (ansible.legacy.command, {'_raw_params': 'set -euo pipefail\necho CHECK DNS AND CONNECTIVITY\nfor host in mirrors.fedoraproject.org mirrors.centos.org; do\n if ! getent hosts "$host"; then\n echo FAILED to lookup host "$host"\n exit 1\n fi\n if ! curl -o /dev/null https://"$host"; then\n echo FAILED to contact host "$host"\n exit 1\n fi\ndone\n', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853268.0456023-12579-270169242773105/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11044 1726853268.62949: _low_level_execute_command(): starting 11044 1726853268.62952: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853268.0456023-12579-270169242773105/ > /dev/null 2>&1 && sleep 0' 11044 1726853268.63538: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11044 1726853268.63545: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 11044 1726853268.63551: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 11044 1726853268.63579: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11044 1726853268.63583: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11044 1726853268.63647: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11044 1726853268.65501: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11044 1726853268.65518: stdout chunk (state=3): >>><<< 11044 1726853268.65528: stderr chunk (state=3): >>><<< 11044 1726853268.65551: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0
11044 1726853268.65564: handler run complete
11044 1726853268.65593: Evaluated conditional (False): False
11044 1726853268.65607: attempt loop complete, returning result
11044 1726853268.65726: _execute() done
11044 1726853268.65729: dumping result to json
11044 1726853268.65731: done dumping result, returning
11044 1726853268.65733: done running TaskExecutor() for managed_node1/TASK: Verify DNS and network connectivity [02083763-bbaf-c5a6-f857-000000000571]
11044 1726853268.65735: sending task result for task 02083763-bbaf-c5a6-f857-000000000571
11044 1726853268.65814: done sending task result for task 02083763-bbaf-c5a6-f857-000000000571
11044 1726853268.65818: WORKER PROCESS EXITING
ok: [managed_node1] => {
    "changed": false,
    "cmd": "set -euo pipefail\necho CHECK DNS AND CONNECTIVITY\nfor host in mirrors.fedoraproject.org mirrors.centos.org; do\n if ! getent hosts \"$host\"; then\n echo FAILED to lookup host \"$host\"\n exit 1\n fi\n if ! curl -o /dev/null https://\"$host\"; then\n echo FAILED to contact host \"$host\"\n exit 1\n fi\ndone\n",
    "delta": "0:00:00.265594",
    "end": "2024-09-20 13:27:48.611500",
    "rc": 0,
    "start": "2024-09-20 13:27:48.345906"
}

STDOUT:

CHECK DNS AND CONNECTIVITY
2600:2701:4000:5211:dead:beef:fe:fed3 wildcard.fedoraproject.org mirrors.fedoraproject.org
2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.fedoraproject.org
2620:52:3:1:dead:beef:cafe:fed7 wildcard.fedoraproject.org mirrors.fedoraproject.org
2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org mirrors.fedoraproject.org
2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org mirrors.fedoraproject.org
2620:52:3:1:dead:beef:cafe:fed6 wildcard.fedoraproject.org mirrors.fedoraproject.org
2620:52:3:1:dead:beef:cafe:fed7 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org
2620:52:3:1:dead:beef:cafe:fed6 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org
2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org
2600:2701:4000:5211:dead:beef:fe:fed3 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org
2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org
2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org

STDERR:

 % Total % Received % Xferd Average Speed Time Time Time Current
 Dload Upload Total Spent Left Speed
 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0
100 305 100 305 0 0 2315 0 --:--:-- --:--:-- --:--:-- 2328
 % Total % Received % Xferd Average Speed Time Time Time Current
 Dload Upload Total Spent Left Speed
 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0
100 291 100 291 0 0 2676 0 --:--:-- --:--:-- --:--:-- 2694

11044 1726853268.65898: no more pending results, returning what we have
11044 1726853268.65902: results queue empty
11044 1726853268.65903:
checking for any_errors_fatal 11044 1726853268.65913: done checking for any_errors_fatal 11044 1726853268.65914: checking for max_fail_percentage 11044 1726853268.65916: done checking for max_fail_percentage 11044 1726853268.65917: checking to see if all hosts have failed and the running result is not ok 11044 1726853268.65918: done checking to see if all hosts have failed 11044 1726853268.65919: getting the remaining hosts for this loop 11044 1726853268.65920: done getting the remaining hosts for this loop 11044 1726853268.65924: getting the next task for host managed_node1 11044 1726853268.65935: done getting next task for host managed_node1 11044 1726853268.65946: ^ task is: TASK: meta (flush_handlers) 11044 1726853268.65948: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11044 1726853268.65953: getting variables 11044 1726853268.65955: in VariableManager get_vars() 11044 1726853268.65999: Calling all_inventory to load vars for managed_node1 11044 1726853268.66002: Calling groups_inventory to load vars for managed_node1 11044 1726853268.66004: Calling all_plugins_inventory to load vars for managed_node1 11044 1726853268.66015: Calling all_plugins_play to load vars for managed_node1 11044 1726853268.66017: Calling groups_plugins_inventory to load vars for managed_node1 11044 1726853268.66020: Calling groups_plugins_play to load vars for managed_node1 11044 1726853268.68251: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11044 1726853268.69372: done with get_vars() 11044 1726853268.69389: done getting variables 11044 1726853268.69442: in VariableManager get_vars() 11044 1726853268.69455: Calling all_inventory to load vars for managed_node1 11044 1726853268.69457: Calling groups_inventory to load vars for managed_node1 11044 1726853268.69458: Calling all_plugins_inventory to load vars for managed_node1 11044 1726853268.69461: Calling all_plugins_play to load vars for managed_node1 11044 1726853268.69463: Calling groups_plugins_inventory to load vars for managed_node1 11044 1726853268.69464: Calling groups_plugins_play to load vars for managed_node1 11044 1726853268.70094: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11044 1726853268.71292: done with get_vars() 11044 1726853268.71324: done queuing things up, now waiting for results queue to drain 11044 1726853268.71327: results queue empty 11044 1726853268.71327: checking for any_errors_fatal 11044 1726853268.71332: done checking for any_errors_fatal 11044 1726853268.71332: checking for max_fail_percentage 11044 1726853268.71333: done checking for max_fail_percentage 11044 1726853268.71334: checking to see if all hosts have failed and the running result is not 
ok 11044 1726853268.71335: done checking to see if all hosts have failed 11044 1726853268.71335: getting the remaining hosts for this loop 11044 1726853268.71336: done getting the remaining hosts for this loop 11044 1726853268.71339: getting the next task for host managed_node1 11044 1726853268.71346: done getting next task for host managed_node1 11044 1726853268.71348: ^ task is: TASK: meta (flush_handlers) 11044 1726853268.71349: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11044 1726853268.71352: getting variables 11044 1726853268.71353: in VariableManager get_vars() 11044 1726853268.71369: Calling all_inventory to load vars for managed_node1 11044 1726853268.71373: Calling groups_inventory to load vars for managed_node1 11044 1726853268.71376: Calling all_plugins_inventory to load vars for managed_node1 11044 1726853268.71381: Calling all_plugins_play to load vars for managed_node1 11044 1726853268.71384: Calling groups_plugins_inventory to load vars for managed_node1 11044 1726853268.71387: Calling groups_plugins_play to load vars for managed_node1 11044 1726853268.72179: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11044 1726853268.73015: done with get_vars() 11044 1726853268.73032: done getting variables 11044 1726853268.73072: in VariableManager get_vars() 11044 1726853268.73082: Calling all_inventory to load vars for managed_node1 11044 1726853268.73084: Calling groups_inventory to load vars for managed_node1 11044 1726853268.73085: Calling all_plugins_inventory to load vars for managed_node1 11044 1726853268.73089: Calling all_plugins_play to load vars for managed_node1 11044 1726853268.73090: Calling groups_plugins_inventory to load vars for 
managed_node1 11044 1726853268.73092: Calling groups_plugins_play to load vars for managed_node1 11044 1726853268.73802: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11044 1726853268.75355: done with get_vars() 11044 1726853268.75377: done queuing things up, now waiting for results queue to drain 11044 1726853268.75379: results queue empty 11044 1726853268.75379: checking for any_errors_fatal 11044 1726853268.75380: done checking for any_errors_fatal 11044 1726853268.75381: checking for max_fail_percentage 11044 1726853268.75381: done checking for max_fail_percentage 11044 1726853268.75382: checking to see if all hosts have failed and the running result is not ok 11044 1726853268.75382: done checking to see if all hosts have failed 11044 1726853268.75383: getting the remaining hosts for this loop 11044 1726853268.75384: done getting the remaining hosts for this loop 11044 1726853268.75386: getting the next task for host managed_node1 11044 1726853268.75388: done getting next task for host managed_node1 11044 1726853268.75389: ^ task is: None 11044 1726853268.75390: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11044 1726853268.75390: done queuing things up, now waiting for results queue to drain 11044 1726853268.75391: results queue empty 11044 1726853268.75391: checking for any_errors_fatal 11044 1726853268.75392: done checking for any_errors_fatal 11044 1726853268.75392: checking for max_fail_percentage 11044 1726853268.75393: done checking for max_fail_percentage 11044 1726853268.75393: checking to see if all hosts have failed and the running result is not ok 11044 1726853268.75394: done checking to see if all hosts have failed 11044 1726853268.75395: getting the next task for host managed_node1 11044 1726853268.75397: done getting next task for host managed_node1 11044 1726853268.75397: ^ task is: None 11044 1726853268.75398: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False

PLAY RECAP *********************************************************************
managed_node1              : ok=76   changed=3    unreachable=0    failed=0    skipped=60   rescued=0    ignored=0

Friday 20 September 2024  13:27:48 -0400 (0:00:00.772)       0:00:33.130 ******
===============================================================================
Install dnsmasq --------------------------------------------------------- 2.18s
/tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_test_interfaces_with_dhcp.yml:3
fedora.linux_system_roles.network : Check which services are running ---- 1.99s
/tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21
fedora.linux_system_roles.network : Check which services are running ---- 1.78s
/tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21
Create test interfaces -------------------------------------------------- 1.72s
/tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_test_interfaces_with_dhcp.yml:35
** TEST check polling interval ------------------------------------------ 1.35s
/tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond_deprecated.yml:75
Gathering Facts --------------------------------------------------------- 1.29s
/tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/tests_bond_deprecated_nm.yml:6
fedora.linux_system_roles.network : Enable and start NetworkManager ----- 1.27s
/tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122
fedora.linux_system_roles.network : Check which packages are installed --- 1.19s
/tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26
Gathering Facts --------------------------------------------------------- 1.08s
/tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond_deprecated.yml:3
fedora.linux_system_roles.network : Configure networking connection profiles --- 0.95s
/tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159
fedora.linux_system_roles.network : Configure networking connection profiles --- 0.93s
/tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159
Verify DNS and network connectivity ------------------------------------- 0.77s
/tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml:24
fedora.linux_system_roles.network : Check which packages are installed --- 0.73s
/tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26
fedora.linux_system_roles.network : Enable and start NetworkManager ----- 0.69s
/tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122
Gather the minimum subset of ansible_facts required by the network role test --- 0.68s
/tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:3
Install pgrep, sysctl --------------------------------------------------- 0.68s
/tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_test_interfaces_with_dhcp.yml:26
fedora.linux_system_roles.network : Re-test connectivity ---------------- 0.60s
/tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192
Check if system is ostree ----------------------------------------------- 0.53s
/tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:17
Remove test interfaces -------------------------------------------------- 0.46s
/tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/remove_test_interfaces_with_dhcp.yml:3
Stat profile file ------------------------------------------------------- 0.45s
/tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:9
11044 1726853268.75496: RUNNING CLEANUP
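For reference, the script that the "Verify DNS and network connectivity" task executed is fully visible in the logged `module_args` (`_raw_params`). A minimal standalone reconstruction is sketched below; wrapping the per-host logic in a `check_host` function, taking the host list from the command line, and adding `-sf` to `curl` (to silence the progress meter that fills the task's stderr above) are editorial changes, not part of the original, which is a flat loop over `mirrors.fedoraproject.org` and `mirrors.centos.org`.

```shell
#!/usr/bin/env bash
# Reconstruction of the connectivity check from the task's _raw_params.
# check_host and the "$@" host list are editorial additions for reuse.
set -euo pipefail

check_host() {
  # Resolve the name first, then fetch its HTTPS front page, mirroring
  # the getent + curl sequence in the logged script.
  local host=$1
  if ! getent hosts "$host"; then
    echo "FAILED to lookup host $host"
    return 1
  fi
  if ! curl -sf -o /dev/null "https://$host"; then
    echo "FAILED to contact host $host"
    return 1
  fi
}

echo CHECK DNS AND CONNECTIVITY
for host in "$@"; do
  check_host "$host" || exit 1
done
```

Invoked as `bash check.sh mirrors.fedoraproject.org mirrors.centos.org` it reproduces the task's behavior: rc 0 with the resolved addresses on stdout when both lookups and fetches succeed, rc 1 with a `FAILED ...` message on the first host that cannot be resolved or contacted.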